[ 462.701794] env[59995]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 463.332528] env[60044]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 464.875423] env[60044]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=60044) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 464.875754] env[60044]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=60044) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 464.875870] env[60044]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=60044) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 464.876139] env[60044]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 464.877257] env[60044]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 464.998994] env[60044]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=60044) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 465.008972] env[60044]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=60044) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 465.110322] env[60044]: INFO nova.virt.driver [None req-5bb7fc99-6836-4460-b874-c3ae2a222c34 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 465.183355] env[60044]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 465.183544] env[60044]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 465.183613] env[60044]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=60044) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 468.350772] env[60044]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-e568292f-fe7f-49b5-abc4-ac164dbe7007 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 468.366364] env[60044]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=60044) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 468.366572] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-90b7b2be-7a40-4c40-86d8-4f26dbb54d27 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 468.400154] env[60044]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 7bfe1.
[ 468.400360] env[60044]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.217s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 468.400884] env[60044]: INFO nova.virt.vmwareapi.driver [None req-5bb7fc99-6836-4460-b874-c3ae2a222c34 None None] VMware vCenter version: 7.0.3
[ 468.404359] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc131274-b4d3-4510-b620-b53d10e3ef1c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 468.421622] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4950579f-a6ba-41ed-9a4b-c9ac7175ed11 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 468.427402] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-178d3464-8f06-48b2-8ba9-2c3dfa9adcdf {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 468.433881] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe9c3ab4-5910-4601-85b8-baf16f361f53 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 468.447359] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2a6adee-91b2-486c-98cb-e3b8d16e40d7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 468.453153] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46dd46df-4fcc-4e10-a575-08237dc7717d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 468.482155] env[60044]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-e8231f72-ae74-463b-b628-bef4c9a05d96 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 468.487188] env[60044]: DEBUG nova.virt.vmwareapi.driver [None req-5bb7fc99-6836-4460-b874-c3ae2a222c34 None None] Extension org.openstack.compute already exists. {{(pid=60044) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 468.489780] env[60044]: INFO nova.compute.provider_config [None req-5bb7fc99-6836-4460-b874-c3ae2a222c34 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 468.507832] env[60044]: DEBUG nova.context [None req-5bb7fc99-6836-4460-b874-c3ae2a222c34 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe(cell1) {{(pid=60044) load_cells /opt/stack/nova/nova/context.py:464}}
[ 468.509712] env[60044]: DEBUG oslo_concurrency.lockutils [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 468.509963] env[60044]: DEBUG oslo_concurrency.lockutils [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 468.510690] env[60044]: DEBUG oslo_concurrency.lockutils [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 468.511077] env[60044]: DEBUG oslo_concurrency.lockutils [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 468.511282] env[60044]: DEBUG oslo_concurrency.lockutils [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 468.512252] env[60044]: DEBUG oslo_concurrency.lockutils [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 468.524976] env[60044]: DEBUG oslo_db.sqlalchemy.engines [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60044) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 468.525381] env[60044]: DEBUG oslo_db.sqlalchemy.engines [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60044) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 468.531685] env[60044]: ERROR nova.db.main.api [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 468.531685] env[60044]: result = function(*args, **kwargs)
[ 468.531685] env[60044]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 468.531685] env[60044]: return func(*args, **kwargs)
[ 468.531685] env[60044]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 468.531685] env[60044]: result = fn(*args, **kwargs)
[ 468.531685] env[60044]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 468.531685] env[60044]: return f(*args, **kwargs)
[ 468.531685] env[60044]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 468.531685] env[60044]: return db.service_get_minimum_version(context, binaries)
[ 468.531685] env[60044]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 468.531685] env[60044]: _check_db_access()
[ 468.531685] env[60044]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 468.531685] env[60044]: stacktrace = ''.join(traceback.format_stack())
[ 468.531685] env[60044]:
[ 468.533161] env[60044]: ERROR nova.db.main.api [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 468.533161] env[60044]: result = function(*args, **kwargs)
[ 468.533161] env[60044]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 468.533161] env[60044]: return func(*args, **kwargs)
[ 468.533161] env[60044]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 468.533161] env[60044]: result = fn(*args, **kwargs)
[ 468.533161] env[60044]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 468.533161] env[60044]: return f(*args, **kwargs)
[ 468.533161] env[60044]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 468.533161] env[60044]: return db.service_get_minimum_version(context, binaries)
[ 468.533161] env[60044]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 468.533161] env[60044]: _check_db_access()
[ 468.533161] env[60044]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 468.533161] env[60044]: stacktrace = ''.join(traceback.format_stack())
[ 468.533161] env[60044]:
[ 468.534274] env[60044]: WARNING nova.objects.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Failed to get minimum service version for cell e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe
[ 468.534274] env[60044]: WARNING nova.objects.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 468.534274] env[60044]: DEBUG oslo_concurrency.lockutils [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Acquiring lock "singleton_lock" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 468.534511] env[60044]: DEBUG oslo_concurrency.lockutils [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Acquired lock "singleton_lock" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 468.534587] env[60044]: DEBUG oslo_concurrency.lockutils [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Releasing lock "singleton_lock" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 468.534859] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Full set of CONF: {{(pid=60044) _wait_for_exit_or_signal
/usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} [ 468.535010] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ******************************************************************************** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 468.535169] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] Configuration options gathered from: {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 468.535258] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 468.535444] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 468.535568] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ================================================================================ {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 468.535768] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] allow_resize_to_same_host = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.535933] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] arq_binding_timeout = 300 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.536073] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] backdoor_port = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.536200] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] backdoor_socket = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.536360] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] block_device_allocate_retries = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.536515] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] block_device_allocate_retries_interval = 3 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.536679] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cert = self.pem {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.536839] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.537011] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] 
compute_monitors = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.537185] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] config_dir = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.537352] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] config_drive_format = iso9660 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.537492] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.537656] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] config_source = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.537819] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] console_host = devstack {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.537979] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] control_exchange = nova {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.538148] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cpu_allocation_ratio = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.538304] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] daemon = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.538466] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] debug = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.538618] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] default_access_ip_network_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.538777] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] default_availability_zone = nova {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.538928] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] default_ephemeral_format = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.539321] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 
'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.539321] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] default_schedule_zone = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.539556] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] disk_allocation_ratio = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.539626] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] enable_new_services = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.539800] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] enabled_apis = ['osapi_compute'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.539978] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] enabled_ssl_apis = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.540166] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] flat_injected = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.540325] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] force_config_drive = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.540509] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] force_raw_images = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.540687] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] graceful_shutdown_timeout = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.540844] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] heal_instance_info_cache_interval = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.541112] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] host = cpu-1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.541299] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.541460] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] initial_disk_allocation_ratio = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.541625] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] initial_ram_allocation_ratio = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.541827] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.541987] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] instance_build_timeout = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.542159] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] instance_delete_interval = 300 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.542326] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] instance_format = [instance: %(uuid)s] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.542487] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] instance_name_template = instance-%08x {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.542642] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] instance_usage_audit = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.542806] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] instance_usage_audit_period = month {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.542965] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.543145] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] instances_path = /opt/stack/data/nova/instances {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.543305] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] internal_service_availability_zone = internal {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.543483] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] key = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.543655] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] live_migration_retry_count = 30 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.543836] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] log_config_append = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.544016] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.544179] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] log_dir = None {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.544333] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] log_file = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.544458] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] log_options = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.544614] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] log_rotate_interval = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.544778] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] log_rotate_interval_type = days {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.544937] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] log_rotation_type = none {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.545073] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.545228] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.545427] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.545623] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.545757] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.545920] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] long_rpc_timeout = 1800 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.546091] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] max_concurrent_builds = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.546253] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] max_concurrent_live_migrations = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.546462] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] max_concurrent_snapshots = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.546648] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] max_local_block_devices = 3 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.546808] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] max_logfile_count = 30 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.546966] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] max_logfile_size_mb = 200 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.547135] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] maximum_instance_delete_attempts = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.547301] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] metadata_listen = 0.0.0.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.547465] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] metadata_listen_port = 8775 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.547628] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] metadata_workers = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.547791] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] migrate_max_retries = -1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.547951] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] mkisofs_cmd = genisoimage {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.548167] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] my_block_storage_ip = 10.180.1.21 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.548296] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] my_ip = 10.180.1.21 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.548452] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] network_allocate_retries = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.548623] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.548784] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] osapi_compute_listen = 0.0.0.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.548944] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] osapi_compute_listen_port = 8774 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.549121] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] osapi_compute_unique_server_name_scope = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.549288] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] osapi_compute_workers = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.549440] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] password_length = 12 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.549594] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] periodic_enable = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.549748] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] periodic_fuzzy_delay = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.549909] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] pointer_model = usbtablet {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.550122] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] preallocate_images = none {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.550288] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] publish_errors = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.550413] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] pybasedir = /opt/stack/nova {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.550564] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ram_allocation_ratio = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.550723] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] rate_limit_burst = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.550878] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] rate_limit_except_level = CRITICAL {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.551085] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] rate_limit_interval = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.551261] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] reboot_timeout = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.551420] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] 
reclaim_instance_interval = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.551571] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] record = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.551733] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] reimage_timeout_per_gb = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.551893] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] report_interval = 120 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.552059] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] rescue_timeout = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.552223] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] reserved_host_cpus = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.552406] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] reserved_host_disk_mb = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.552567] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] reserved_host_memory_mb = 512 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.552725] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] reserved_huge_pages = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.552881] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] resize_confirm_window = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.553046] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] resize_fs_using_block_device = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.553215] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] resume_guests_state_on_host_boot = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.553392] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.553549] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] rpc_response_timeout = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.553705] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] run_external_periodic_tasks = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.553865] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] running_deleted_instance_action = reap 
{{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.554029] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] running_deleted_instance_poll_interval = 1800 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.554192] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] running_deleted_instance_timeout = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.554344] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler_instance_sync_interval = 120 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.554474] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] service_down_time = 300 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.554636] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] servicegroup_driver = db {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.554793] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] shelved_offload_time = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.554946] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] shelved_poll_interval = 3600 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.555126] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] shutdown_timeout = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.555279] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] source_is_ipv6 = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.555432] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ssl_only = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.555675] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.555839] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] sync_power_state_interval = 600 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.555995] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] sync_power_state_pool_size = 1000 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.556172] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] syslog_log_facility = LOG_USER {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.556324] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] tempdir = None {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.556476] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] timeout_nbd = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.556636] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] transport_url = **** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.556789] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] update_resources_interval = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.556942] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] use_cow_images = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.557108] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] use_eventlog = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.557262] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] use_journal = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.557414] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] use_json = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.557564] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] use_rootwrap_daemon = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.557713] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] use_stderr = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.557863] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] use_syslog = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.558018] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vcpu_pin_set = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.558184] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plugging_is_fatal = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.558345] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plugging_timeout = 300 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.558502] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] virt_mkfs = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.558656] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] volume_usage_poll_interval = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.558812] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] watch_log_file = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.558976] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] web = /usr/share/spice-html5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 468.559174] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_concurrency.disable_process_locking = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.559452] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.559627] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.559787] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.559995] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.560167] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.560331] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.560509] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.auth_strategy = keystone {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.560673] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.compute_link_prefix = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.560850] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.561044] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.dhcp_domain = novalocal {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.561231] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.enable_instance_password = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.561393] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.glance_link_prefix = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.561551] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.561717] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.561874] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.instance_list_per_project_cells = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.562041] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.list_records_by_skipping_down_cells = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.562206] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.local_metadata_per_cell = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.562400] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.max_limit = 1000 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.562576] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.metadata_cache_expiration = 15 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.562746] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.neutron_default_tenant_id = default {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.562908] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.use_forwarded_for = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.563083] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.use_neutron_default_nets = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.563253] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.563414] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.563575] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.563744] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.563911] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.vendordata_dynamic_targets = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.564089] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.vendordata_jsonfile_path = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.564270] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.564461] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.backend = dogpile.cache.memcached {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.564624] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.backend_argument = **** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.564793] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.config_prefix = cache.oslo {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.564954] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.dead_timeout = 60.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.565129] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.debug_cache_backend = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.565287] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.enable_retry_client = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.565443] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.enable_socket_keepalive = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.565607] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.enabled = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.565767] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.expiration_time = 600 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.565924] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.hashclient_retry_attempts = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.566096] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.hashclient_retry_delay = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.566256] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] 
cache.memcache_dead_retry = 300 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.566420] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.memcache_password = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.566581] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.566739] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.566899] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.memcache_pool_maxsize = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.567069] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.567237] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.memcache_sasl_enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.567413] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.567577] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.memcache_socket_timeout = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.567745] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.memcache_username = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.567912] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.proxies = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.568082] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.retry_attempts = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.568251] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.retry_delay = 0.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.568418] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.socket_keepalive_count = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.568570] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.socket_keepalive_idle = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.568727] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.socket_keepalive_interval = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.568882] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.tls_allowed_ciphers = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.569049] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.tls_cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.569205] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.tls_certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.569362] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.tls_enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.569515] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cache.tls_keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.569682] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.auth_section = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.569856] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.auth_type = password {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.570046] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.570231] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.catalog_info = volumev3::publicURL {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.570391] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.570553] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.570713] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.cross_az_attach = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.570876] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.debug = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.571056] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.endpoint_template = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.571238] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 
None None] cinder.http_retries = 3 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.571403] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.571560] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.571732] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.os_region_name = RegionOne {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.571892] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.572063] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cinder.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.572276] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.572433] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.cpu_dedicated_set = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.572581] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.cpu_shared_set = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.572743] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.image_type_exclude_list = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.572903] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.573075] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.max_concurrent_disk_ops = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.573239] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.max_disk_devices_to_attach = -1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.573397] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.573561] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
468.573721] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.resource_provider_association_refresh = 300 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.573881] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.shutdown_retry_interval = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.574068] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.574249] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] conductor.workers = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.574445] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] console.allowed_origins = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.574610] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] console.ssl_ciphers = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.574778] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] console.ssl_minimum_version = default {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.574949] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] consoleauth.token_ttl = 600 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.575176] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.575292] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.575453] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.575608] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.connect_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.575763] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.connect_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.575915] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.endpoint_override = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.576097] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.insecure = False {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.576258] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.576414] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.max_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.576569] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.min_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.576723] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.region_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.576878] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.service_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.577055] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.service_type = accelerator {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.577220] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.577373] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.status_code_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.577525] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.status_code_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.577680] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.577857] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.578026] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] cyborg.version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.578212] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.backend = sqlalchemy {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.578395] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.connection = **** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.578557] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.connection_debug = 0 {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.578722] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.connection_parameters = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.578883] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.connection_recycle_time = 3600 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.579057] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.connection_trace = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.579224] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.db_inc_retry_interval = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.579383] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.db_max_retries = 20 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.579541] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.db_max_retry_interval = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.579699] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.db_retry_interval = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.579873] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.max_overflow = 50 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.580074] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.max_pool_size = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.580255] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.max_retries = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.580420] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.mysql_enable_ndb = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.580588] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.580747] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.mysql_wsrep_sync_wait = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.580906] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.pool_timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.581102] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.retry_interval = 10 
{{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.581277] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.slave_connection = **** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.581444] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.sqlite_synchronous = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.581603] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] database.use_db_reconnect = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.581779] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.backend = sqlalchemy {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.581955] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.connection = **** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.582135] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.connection_debug = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.582329] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.connection_parameters = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.582504] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.connection_recycle_time = 3600 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.582670] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.connection_trace = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.582830] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.db_inc_retry_interval = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.582992] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.db_max_retries = 20 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.583168] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.db_max_retry_interval = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.583329] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.db_retry_interval = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.583494] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.max_overflow = 50 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.583652] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.max_pool_size = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.583815] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.max_retries = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.583976] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.mysql_enable_ndb = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.584157] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.584316] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.584475] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.pool_timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.584640] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.retry_interval = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.584795] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.slave_connection = **** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.584958] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] api_database.sqlite_synchronous = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.585146] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] devices.enabled_mdev_types = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.585322] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.585482] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ephemeral_storage_encryption.enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.585642] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.585811] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.api_servers = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.587424] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.cafile = None {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.587621] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.587797] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.587966] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.connect_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.588145] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.connect_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.588312] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.debug = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.588479] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.default_trusted_certificate_ids = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.588645] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.enable_certificate_validation = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.588811] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.enable_rbd_download = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.588972] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.endpoint_override = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.589156] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.589321] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.589480] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.max_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.589638] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.min_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.589799] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.num_retries = 3 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.589988] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.rbd_ceph_conf = {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.590179] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.rbd_connect_timeout = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.590352] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.rbd_pool = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.590519] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.rbd_user = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.590681] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.region_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.590841] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.service_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.591018] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.service_type = image {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.591210] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.591373] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.status_code_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.591531] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.status_code_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.591689] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.591871] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.592047] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.verify_glance_signatures = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.592212] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] glance.version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.592423] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] guestfs.debug = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.592604] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.config_drive_cdrom = False {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.592769] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.config_drive_inject_password = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.592935] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.593114] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.enable_instance_metrics_collection = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.593278] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.enable_remotefx = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.593447] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.instances_path_share = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.593611] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.iscsi_initiator_list = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.593770] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.limit_cpu_features = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.593932] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.594106] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.594276] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.power_state_check_timeframe = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.594436] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.594601] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.594762] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.use_multipath_io = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.594923] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.volume_attach_retry_count = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.595093] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.595253] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.vswitch_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.595411] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.595575] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] mks.enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.595928] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.596134] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] image_cache.manager_interval = 2400 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.596305] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] image_cache.precache_concurrency = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.596475] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] image_cache.remove_unused_base_images = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.596643] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.596807] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.596981] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] image_cache.subdirectory_name = _base {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.597168] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.api_max_retries = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.597331] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.api_retry_interval = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.597488] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.auth_section = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.597645] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.auth_type = None {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.597800] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.597955] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.598131] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.598289] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.connect_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.598482] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.connect_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.598654] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.endpoint_override = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.598817] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.598973] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.599143] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.max_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.599297] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.min_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.599450] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.partition_key = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.599610] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.peer_list = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.599763] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.region_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.599929] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.serial_console_state_timeout = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.600121] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.service_name = None {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.600294] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.service_type = baremetal {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.600454] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.600607] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.status_code_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.600763] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.status_code_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.600947] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.601124] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.601300] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ironic.version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.601480] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.601650] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] key_manager.fixed_key = **** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.601827] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.601988] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.barbican_api_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.602162] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.barbican_endpoint = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.602354] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.barbican_endpoint_type = public {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.602527] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.barbican_region_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.602686] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.602843] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.603009] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.603177] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.603332] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.603490] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.number_of_retries = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.603649] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.retry_delay = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.603807] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.send_service_user_token = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.603965] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.604134] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605222] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.verify_ssl = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605222] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican.verify_ssl_path = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605222] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican_service_user.auth_section = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605222] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican_service_user.auth_type = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605222] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican_service_user.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605222] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican_service_user.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605222] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican_service_user.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605506] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican_service_user.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605506] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican_service_user.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605637] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican_service_user.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605792] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] barbican_service_user.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.605954] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.approle_role_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.606121] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.approle_secret_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.606276] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.606428] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.606585] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.606741] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.606894] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.607071] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.kv_mountpoint = secret {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.607237] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.kv_version = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.607393] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.namespace = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.607549] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.root_token_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.607709] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.607861] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.ssl_ca_crt_file = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.608026] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.608191] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.use_ssl = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.608356] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.608520] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.608682] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.608842] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.608998] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.connect_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.609170] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.connect_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.609325] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.endpoint_override = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.609482] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.609636] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.609791] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 
None None] keystone.max_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.609963] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.min_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.610148] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.region_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.610309] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.service_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.610501] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.service_type = identity {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.610679] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.610840] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.status_code_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.610998] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.status_code_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.611200] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.611387] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.611550] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] keystone.version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.611750] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.connection_uri = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.611912] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.cpu_mode = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.612088] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.cpu_model_extra_flags = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.612263] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.cpu_models = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.612435] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None 
None] libvirt.cpu_power_governor_high = performance {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.612603] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.cpu_power_governor_low = powersave {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.612767] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.cpu_power_management = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.612939] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.613118] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.device_detach_attempts = 8 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.613284] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.device_detach_timeout = 20 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.613450] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.disk_cachemodes = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.613607] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.disk_prefix = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.613770] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.enabled_perf_events = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.613934] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.file_backed_memory = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.614109] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.gid_maps = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.614267] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.hw_disk_discard = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.614422] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.hw_machine_type = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.614588] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.images_rbd_ceph_conf = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.614752] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.614915] env[60044]: DEBUG 
oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.615091] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.images_rbd_glance_store_name = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.615263] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.images_rbd_pool = rbd {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.615433] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.images_type = default {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.615590] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.images_volume_group = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.615753] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.inject_key = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.615913] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.inject_partition = -2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.616084] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.inject_password = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.616248] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.iscsi_iface = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.616407] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.iser_use_multipath = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.616567] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_bandwidth = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.616724] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.616887] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_downtime = 500 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.617060] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.617228] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.617385] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_inbound_addr = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.617545] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.617704] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_permit_post_copy = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.617876] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_scheme = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.618064] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_timeout_action = abort {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.618234] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_tunnelled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.618394] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_uri = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.618556] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.live_migration_with_native_tls = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.618715] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.max_queues = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.618878] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.619043] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.nfs_mount_options = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.619350] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.619523] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.619688] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.num_iser_scan_tries = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.619849] env[60044]: DEBUG 
oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.num_memory_encrypted_guests = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.620048] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.620228] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.num_pcie_ports = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.620395] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.num_volume_scan_tries = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.620561] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.pmem_namespaces = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.620722] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.quobyte_client_cfg = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.621018] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.621224] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.rbd_connect_timeout = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.621395] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.621559] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.621717] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.rbd_secret_uuid = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.621872] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.rbd_user = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.622044] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.622223] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.remote_filesystem_transport = ssh {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.622415] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.rescue_image_id = None {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.622579] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.rescue_kernel_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.622735] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.rescue_ramdisk_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.622902] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.623069] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.rx_queue_size = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.623242] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.smbfs_mount_options = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.623510] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.623680] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.snapshot_compression = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.623840] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.snapshot_image_format = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.624067] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.624238] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.sparse_logical_volumes = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.624400] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.swtpm_enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.624567] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.swtpm_group = tss {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.624734] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.swtpm_user = tss {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.624902] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.sysinfo_serial = unique {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.625069] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.tx_queue_size = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.625237] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.uid_maps = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.625398] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.use_virtio_for_bridges = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.625566] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.virt_type = kvm {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.625730] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.volume_clear = zero {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.625889] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.volume_clear_size = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.626062] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.volume_use_multipath = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.626223] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.vzstorage_cache_path = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.626389] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.626554] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.vzstorage_mount_group = qemu {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.626715] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.vzstorage_mount_opts = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.626878] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.627160] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.627338] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.vzstorage_mount_user = stack {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.627502] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.627671] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.auth_section = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.627841] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.auth_type = password {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.628010] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.628173] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.628335] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.628492] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.connect_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.628649] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.connect_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.628817] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.default_floating_pool = public {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.628974] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.endpoint_override = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.629151] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.extension_sync_interval = 600 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.629311] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.http_retries = 3 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.629470] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.629626] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.629778] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.max_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.629961] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.630150] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.min_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.630321] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.ovs_bridge = br-int {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.630485] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.physnets = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.630649] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.region_name = RegionOne {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.630816] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.service_metadata_proxy = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.630975] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.service_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.631178] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.service_type = network {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.631345] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.631502] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.status_code_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.631656] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.status_code_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.631811] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.631988] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.632166] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] neutron.version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.632373] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] notifications.bdms_in_notifications = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.632563] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] notifications.default_level = INFO 
{{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.632738] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] notifications.notification_format = unversioned {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.632902] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] notifications.notify_on_state_change = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.633087] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.633269] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] pci.alias = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.633436] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] pci.device_spec = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.633597] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] pci.report_in_placement = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.633768] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.auth_section = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.633937] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.auth_type = password {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.634116] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.634276] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.634433] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.634588] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.634742] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.connect_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.634896] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.connect_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.635063] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.default_domain_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.635225] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.default_domain_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.635378] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.domain_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.635533] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.domain_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.635686] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.endpoint_override = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.635844] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.635999] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.636168] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.max_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.636322] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.min_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.636487] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.password = **** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.636642] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.project_domain_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.636805] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.project_domain_name = Default {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.636968] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.project_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.637151] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.project_name = service {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.637320] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.region_name = RegionOne {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.637476] env[60044]: DEBUG oslo_service.service 
[None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.service_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.637640] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.service_type = placement {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.637800] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.637954] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.status_code_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.638123] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.status_code_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.638278] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.system_scope = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.638432] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.638585] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.trust_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.638739] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.user_domain_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.638902] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.user_domain_name = Default {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.639069] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.user_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.639245] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.username = placement {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.639423] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.639581] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] placement.version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.639754] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.cores = 20 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.639914] env[60044]: DEBUG 
oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.count_usage_from_placement = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.640120] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.640301] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.injected_file_content_bytes = 10240 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.640464] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.injected_file_path_length = 255 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.640627] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.injected_files = 5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.640789] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.instances = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.640955] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.key_pairs = 100 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.641150] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.metadata_items = 128 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.641323] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.ram = 51200 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.641486] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.recheck_quota = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.641649] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.server_group_members = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.641814] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] quota.server_groups = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.641981] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] rdp.enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.642318] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.642519] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.642687] 
env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.642850] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler.image_metadata_prefilter = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.643018] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.643185] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler.max_attempts = 3 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.643349] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler.max_placement_results = 1000 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.643511] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.643669] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler.query_placement_for_availability_zone = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.643828] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler.query_placement_for_image_type_support = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.643984] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.644171] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] scheduler.workers = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.644353] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.644540] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.644720] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.644890] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.645072] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.645240] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.645401] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.645586] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.645752] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.host_subset_size = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.645911] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.646085] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.646254] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.isolated_hosts = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.646417] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.isolated_images = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.646578] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.646737] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.646894] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.pci_in_placement = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.647063] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.647234] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.647396] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.647556] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.647712] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.647874] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.648045] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.track_instance_changes = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.648228] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.648396] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] metrics.required = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.648557] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] metrics.weight_multiplier = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.648716] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.648878] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] metrics.weight_setting = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.649181] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.649358] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] serial_console.enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.649534] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] serial_console.port_range = 10000:20000 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.649702] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.649868] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.650069] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] serial_console.serialproxy_port = 6083 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.650251] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] service_user.auth_section = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.650423] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] service_user.auth_type = password {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.650581] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] service_user.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.650737] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] service_user.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.650897] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] service_user.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.651073] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] service_user.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.651231] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] service_user.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.651399] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] service_user.send_service_user_token = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.651587] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] service_user.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.651734] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None 
None] service_user.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.651901] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.agent_enabled = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.652087] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.652408] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.652605] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.652782] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.html5proxy_port = 6082 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.652945] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.image_compression = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.653117] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.jpeg_compression = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.653283] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.playback_compression = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.653465] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.server_listen = 127.0.0.1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.653636] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.653794] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.streaming_mode = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.653950] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] spice.zlib_compression = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.654130] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] upgrade_levels.baseapi = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.654291] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] upgrade_levels.cert = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.654461] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] upgrade_levels.compute = auto {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.654619] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] upgrade_levels.conductor = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.654783] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] upgrade_levels.scheduler = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.654952] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vendordata_dynamic_auth.auth_section = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.655127] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vendordata_dynamic_auth.auth_type = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.655285] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vendordata_dynamic_auth.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.655441] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vendordata_dynamic_auth.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.655599] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.655756] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vendordata_dynamic_auth.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.655911] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vendordata_dynamic_auth.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.656080] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.656241] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vendordata_dynamic_auth.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.656440] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.api_retry_count = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.656620] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.ca_file = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.656794] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.cache_prefix = devstack-image-cache {{(pid=60044) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.656961] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.cluster_name = testcl1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.657138] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.connection_pool_size = 10 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.657296] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.console_delay_seconds = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.657462] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.datastore_regex = ^datastore.* {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.657660] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.657832] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.host_password = **** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.657994] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.host_port = 443 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.658173] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.host_username = administrator@vsphere.local {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.658339] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.insecure = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.658544] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.integration_bridge = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.658728] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.maximum_objects = 100 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.658888] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.pbm_default_policy = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.659072] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.pbm_enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.659235] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.pbm_wsdl_location = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.659403] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.659562] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.serial_port_proxy_uri = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.659718] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.serial_port_service_uri = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.659882] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.task_poll_interval = 0.5 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.660091] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.use_linked_clone = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.660273] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.vnc_keymap = en-us {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.660442] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.vnc_port = 5900 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.660605] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vmware.vnc_port_total = 10000 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.660793] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vnc.auth_schemes = ['none'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.660978] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vnc.enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.661274] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.661459] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.661631] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vnc.novncproxy_port = 6080 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.661808] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vnc.server_listen = 127.0.0.1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.661981] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.662159] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 
None None] vnc.vencrypt_ca_certs = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.662333] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vnc.vencrypt_client_cert = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.662491] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vnc.vencrypt_client_key = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.662670] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.662835] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.disable_deep_image_inspection = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.662996] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.663172] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.663331] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.663490] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.disable_rootwrap = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.663646] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.enable_numa_live_migration = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.663802] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.663960] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.664133] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.664293] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.libvirt_disable_apic = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.664453] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] 
workarounds.never_download_image_if_on_rbd = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.664611] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.664769] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.664930] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.665103] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.665265] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.665423] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.665585] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.665739] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.665901] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.666094] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.666266] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] wsgi.client_socket_timeout = 900 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.666431] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] wsgi.default_pool_size = 1000 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.666595] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] wsgi.keep_alive = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.666759] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] 
wsgi.max_header_line = 16384 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.666919] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] wsgi.secure_proxy_ssl_header = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.667095] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] wsgi.ssl_ca_file = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.667253] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] wsgi.ssl_cert_file = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.667411] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] wsgi.ssl_key_file = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.667570] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] wsgi.tcp_keepidle = 600 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.667739] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.667902] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] zvm.ca_file = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.668098] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] zvm.cloud_connector_url = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.668383] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.668555] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] zvm.reachable_timeout = 300 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.668733] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_policy.enforce_new_defaults = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.668901] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_policy.enforce_scope = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.669096] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_policy.policy_default_rule = default {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.669267] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
[ 468.669439] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_policy.policy_file = policy.yaml {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.669608] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.669764] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.669969] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.670127] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.670295] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.670464] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.670638] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.670812] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] profiler.connection_string = messaging:// {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.670978] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] profiler.enabled = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.671159] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] profiler.es_doc_type = notification {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.671314] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] profiler.es_scroll_size = 10000 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.671477] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] profiler.es_scroll_time = 2m {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.671637] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] profiler.filter_error_trace = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.671802] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] profiler.hmac_keys = SECRET_KEY {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.671965] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] profiler.sentinel_service_name = mymaster {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.672147] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] profiler.socket_timeout = 0.1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.672326] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] profiler.trace_sqlalchemy = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.672498] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] remote_debug.host = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.672656] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] remote_debug.port = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.672829] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.672989] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.673166] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.673324] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.673482] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.673639] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.673797] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.673956] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.674130] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.674286] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.674455] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.674620] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.674787] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.674950] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.675126] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.675299] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.675460] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.675621] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.675784] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.675945] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.676119] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.676284] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.676443] env[60044]: DEBUG 
oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.676600] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.676763] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.676927] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.ssl = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.677122] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.677296] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.677458] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.677626] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.677795] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_rabbit.ssl_version = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.678022] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.678211] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_notifications.retry = -1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.678406] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.678581] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_messaging_notifications.transport_url = **** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.678752] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.auth_section = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.678916] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.auth_type = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.679093] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.cafile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.679253] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.certfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.679413] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.collect_timing = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.679570] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.connect_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.679726] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.connect_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.679880] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.endpoint_id = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.680075] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.endpoint_override = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.680247] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.insecure = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.680404] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.keyfile = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.680557] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.max_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.680714] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.min_version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.680867] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.region_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.681033] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.service_name = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.681198] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.service_type = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.681359] env[60044]: DEBUG oslo_service.service [None 
req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.split_loggers = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.681515] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.status_code_retries = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.681671] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.status_code_retry_delay = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.681827] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.timeout = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.681985] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.valid_interfaces = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.682158] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_limit.version = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.682323] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_reports.file_event_handler = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.682483] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.682641] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] oslo_reports.log_dir = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.682809] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.682966] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.683137] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.683301] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.683460] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.683617] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] 
vif_plug_linux_bridge_privileged.user = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.683783] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.683940] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plug_ovs_privileged.group = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.684109] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.684275] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.684434] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.684590] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] vif_plug_ovs_privileged.user = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.684757] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_linux_bridge.flat_interface = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.684933] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.685115] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.685285] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.685452] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.685617] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.685780] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.685941] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.686130] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_ovs.isolate_vif = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.686301] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.686467] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.686635] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.686808] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_ovs.ovsdb_interface = native {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.686971] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_vif_ovs.per_port_bridge = False {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.687150] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_brick.lock_path = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.687313] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.687472] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.687639] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] privsep_osbrick.capabilities = [21] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.687796] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] privsep_osbrick.group = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.687967] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] privsep_osbrick.helper_command = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.688160] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.688325] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.688483] env[60044]: DEBUG oslo_service.service 
[None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] privsep_osbrick.user = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.688654] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.688814] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] nova_sys_admin.group = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.688970] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] nova_sys_admin.helper_command = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.689147] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.689308] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.689464] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] nova_sys_admin.user = None {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 468.689590] env[60044]: DEBUG oslo_service.service [None req-61106e2c-fffb-4621-a11a-61cd57328491 None None] ******************************************************************************** {{(pid=60044) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 468.690021] env[60044]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 468.698540] env[60044]: INFO nova.virt.node [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Generated node identity f00c8c1a-f294-46ac-89cc-95e9e57a7dca [ 468.698768] env[60044]: INFO nova.virt.node [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Wrote node identity f00c8c1a-f294-46ac-89cc-95e9e57a7dca to /opt/stack/data/n-cpu-1/compute_id [ 468.709644] env[60044]: WARNING nova.compute.manager [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Compute nodes ['f00c8c1a-f294-46ac-89cc-95e9e57a7dca'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 468.741639] env[60044]: INFO nova.compute.manager [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 468.763204] env[60044]: WARNING nova.compute.manager [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
[ 468.763436] env[60044]: DEBUG oslo_concurrency.lockutils [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 468.763635] env[60044]: DEBUG oslo_concurrency.lockutils [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 468.763775] env[60044]: DEBUG oslo_concurrency.lockutils [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 468.763930] env[60044]: DEBUG nova.compute.resource_tracker [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 468.765008] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-409e7724-1c82-4043-a40f-d1ed1df859ae {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 468.774013] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76ad5300-8685-4300-9d54-a6fe331b7299 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 468.787634] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77a77dcc-963b-4312-a705-3c368e3c5902 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 468.793908] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cadc6ee1-997c-4047-8ada-aa6106b5563a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 468.822370] env[60044]: DEBUG nova.compute.resource_tracker [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181268MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 468.822548] env[60044]: DEBUG oslo_concurrency.lockutils [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 468.822704] env[60044]: DEBUG oslo_concurrency.lockutils [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 468.834094] env[60044]: WARNING nova.compute.resource_tracker [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] No compute node 
record for cpu-1:f00c8c1a-f294-46ac-89cc-95e9e57a7dca: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host f00c8c1a-f294-46ac-89cc-95e9e57a7dca could not be found. [ 468.846323] env[60044]: INFO nova.compute.resource_tracker [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: f00c8c1a-f294-46ac-89cc-95e9e57a7dca [ 468.895230] env[60044]: DEBUG nova.compute.resource_tracker [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 468.895441] env[60044]: DEBUG nova.compute.resource_tracker [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 468.994772] env[60044]: INFO nova.scheduler.client.report [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] [req-b7cb5c9a-672f-4582-b71c-2f81ae65e2b6] Created resource provider record via placement API for resource provider with UUID f00c8c1a-f294-46ac-89cc-95e9e57a7dca and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 469.010327] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-794b180d-f9d6-4e65-9992-fb478985f6fb {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 469.017849] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c2e629c-c74a-46bc-a1bb-2777e521fb96 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 469.047164] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6a46f3f-cad0-4a35-8970-a45e8f957f48 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 469.054258] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1daa3d35-f312-411c-8363-6249e51a2ed0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 469.067209] env[60044]: DEBUG nova.compute.provider_tree [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Updating inventory in ProviderTree for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 469.103066] env[60044]: DEBUG nova.scheduler.client.report [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Updated inventory for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 469.103308] env[60044]: DEBUG nova.compute.provider_tree [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Updating resource provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca generation from 0 to 1 during operation: update_inventory {{(pid=60044) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 469.103453] env[60044]: DEBUG nova.compute.provider_tree [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Updating inventory in ProviderTree for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 469.168980] env[60044]: DEBUG nova.compute.provider_tree [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Updating resource provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca generation from 1 to 2 during operation: update_traits {{(pid=60044) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 469.185585] env[60044]: DEBUG nova.compute.resource_tracker [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 469.185777] env[60044]: DEBUG oslo_concurrency.lockutils [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.363s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 469.185916] env[60044]: DEBUG nova.service [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Creating RPC server for service compute {{(pid=60044) start /opt/stack/nova/nova/service.py:182}} [ 469.198312] env[60044]: DEBUG nova.service [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] Join ServiceGroup membership for this service compute {{(pid=60044) start /opt/stack/nova/nova/service.py:199}} [ 469.198513] env[60044]: DEBUG nova.servicegroup.drivers.db [None req-60c3dbad-f112-478e-af76-6d6052001491 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=60044) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 505.200625] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._sync_power_states {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 505.211643] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Getting list of instances from cluster (obj){ [ 505.211643] env[60044]: value = "domain-c8" [ 505.211643] env[60044]: _type = "ClusterComputeResource" [ 505.211643] env[60044]: } {{(pid=60044) list_instances 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 505.212836] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f5fb27e-a76b-4f8d-9a12-0ac7aed0755f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 505.223859] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Got total of 0 instances {{(pid=60044) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 505.224103] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 505.224445] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Getting list of instances from cluster (obj){ [ 505.224445] env[60044]: value = "domain-c8" [ 505.224445] env[60044]: _type = "ClusterComputeResource" [ 505.224445] env[60044]: } {{(pid=60044) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 505.225324] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16c0662d-4b3b-4d94-a0aa-14758352c3b1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 505.233074] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Got total of 0 instances {{(pid=60044) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 512.971438] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 512.971712] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 512.995731] env[60044]: DEBUG nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Starting instance... 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 513.103622] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 513.103874] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 513.105584] env[60044]: INFO nova.compute.claims [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 513.254638] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62115b55-b1a1-4088-8df8-61d718f62950 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 513.266319] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f0cb3f8-4e72-49ee-b162-4420b6d6a454 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 513.303733] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e86ce010-0964-4b47-908e-d539e91566cd {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 513.311713] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edc55f2d-2b6a-4bd3-981a-e55eb6a0654a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 513.327995] env[60044]: DEBUG nova.compute.provider_tree [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 513.336443] env[60044]: DEBUG nova.scheduler.client.report [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 513.354753] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 
tempest-ServerDiagnosticsTest-107573575-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 513.355433] env[60044]: DEBUG nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 513.412541] env[60044]: DEBUG nova.compute.utils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 513.413884] env[60044]: DEBUG nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 513.417018] env[60044]: DEBUG nova.network.neutron [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 513.432253] env[60044]: DEBUG nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 513.522079] env[60044]: DEBUG nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 513.865975] env[60044]: DEBUG nova.virt.hardware [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 513.866104] env[60044]: DEBUG nova.virt.hardware [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 513.866192] env[60044]: DEBUG nova.virt.hardware [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 513.866409] env[60044]: DEBUG nova.virt.hardware [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 513.866609] env[60044]: DEBUG nova.virt.hardware [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 513.866771] env[60044]: DEBUG nova.virt.hardware [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 513.866980] env[60044]: DEBUG nova.virt.hardware [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 513.867160] env[60044]: DEBUG nova.virt.hardware [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 513.867503] env[60044]: DEBUG nova.virt.hardware [None 
req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 513.867675] env[60044]: DEBUG nova.virt.hardware [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 513.867846] env[60044]: DEBUG nova.virt.hardware [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 513.868788] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a95c377e-d9be-4201-958a-a42757abb929 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 513.876876] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b08e67f-e74e-49aa-915f-b1995fdf18a0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 513.892801] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-200846d8-01f8-4515-9e9d-c0ea714030e9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 514.220045] env[60044]: DEBUG nova.policy [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c27275e1557427f82256d40fed0934e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '96139b5146b84a968b5a8e9c51ada438', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 515.148272] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "db1dd823-8349-4f34-9a8e-ecec90bd105b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.150605] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "db1dd823-8349-4f34-9a8e-ecec90bd105b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.168112] env[60044]: DEBUG nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b 
tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 515.218244] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.218501] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.219967] env[60044]: INFO nova.compute.claims [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 515.344160] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcab74fe-9a47-4b9d-8567-fa9c1d08333d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 515.357594] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32aab5e3-07c5-42e2-9e5b-fa1668d65bd3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 515.399455] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0692f6d1-b960-4c56-a553-658392753c30 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 515.414046] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf1ceb63-d421-45f9-aba3-f80eb6ddbd0c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 515.434050] env[60044]: DEBUG nova.compute.provider_tree [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 515.450892] env[60044]: DEBUG nova.scheduler.client.report [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 
1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 515.478334] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.478908] env[60044]: DEBUG nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 515.539315] env[60044]: DEBUG nova.compute.utils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 515.540826] env[60044]: DEBUG nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 515.541070] env[60044]: DEBUG nova.network.neutron [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 515.554625] env[60044]: DEBUG nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 515.658671] env[60044]: DEBUG nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 515.696636] env[60044]: DEBUG nova.virt.hardware [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 515.700221] env[60044]: DEBUG nova.virt.hardware [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 515.700498] env[60044]: DEBUG nova.virt.hardware [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 515.700730] env[60044]: DEBUG nova.virt.hardware [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 515.700849] env[60044]: DEBUG nova.virt.hardware [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 515.700989] env[60044]: DEBUG nova.virt.hardware [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 515.701231] env[60044]: DEBUG nova.virt.hardware [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 515.701404] env[60044]: DEBUG nova.virt.hardware [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 515.701649] env[60044]: DEBUG nova.virt.hardware [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 515.701722] env[60044]: DEBUG nova.virt.hardware [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 515.701889] env[60044]: DEBUG nova.virt.hardware [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 515.703295] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74fe53c1-52f4-4ab0-b6f7-a165e8336477 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 515.712471] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf795f48-983b-4ab4-9b10-f3c5d826fcbe {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 515.770528] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "ebc60b43-dc9e-4f3c-81c7-f65fe50be628" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.770853] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "ebc60b43-dc9e-4f3c-81c7-f65fe50be628" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.779588] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "4e62d785-7c74-4d3a-9446-e690822d5386" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.779588] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "4e62d785-7c74-4d3a-9446-e690822d5386" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.793746] env[60044]: DEBUG nova.compute.manager [None 
req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 515.797209] env[60044]: DEBUG nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 515.825978] env[60044]: DEBUG nova.network.neutron [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Successfully created port: 26a18662-8fba-4d77-a530-f366d7c04bd8 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 515.871469] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.871869] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.873255] env[60044]: INFO nova.compute.claims [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 515.881086] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 516.016423] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-051b46a4-dda7-463b-99cd-e2cc514dd861 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.024255] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff530c4a-53d0-49c5-92e4-89d31a305633 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.058473] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c0616f9-d7c8-405a-81d5-da2d6d02f63f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.067184] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3e38da6-8db8-4389-bebe-e30911f47ca4 
{{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.081911] env[60044]: DEBUG nova.compute.provider_tree [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 516.088227] env[60044]: DEBUG nova.policy [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b98ad87c6bf548458598d21c0d163b57', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a8331e6e26314ee3bd30dd7f6494daf4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 516.092120] env[60044]: DEBUG nova.scheduler.client.report [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 516.111751] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.240s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 516.112240] env[60044]: DEBUG nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Start building networks asynchronously for instance. 
{{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 516.115235] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.234s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 516.116600] env[60044]: INFO nova.compute.claims [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 516.180910] env[60044]: DEBUG nova.compute.utils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 516.181930] env[60044]: DEBUG nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 516.182532] env[60044]: DEBUG nova.network.neutron [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 516.198758] env[60044]: DEBUG nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 516.272407] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32d9267f-7b05-49bf-8368-e7917e356a79 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.283749] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c8d3e9f-7cbc-4001-8661-7b5074f9c8d0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.319568] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92af7a93-59d6-4dae-9772-b035e390a713 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.322947] env[60044]: DEBUG nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 516.334310] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-777a6177-e8dc-404c-ab06-6619bd161096 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.349230] env[60044]: DEBUG nova.compute.provider_tree [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 516.357161] env[60044]: DEBUG nova.scheduler.client.report [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 516.364095] env[60044]: DEBUG nova.virt.hardware [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 516.364095] env[60044]: DEBUG nova.virt.hardware [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 516.364095] env[60044]: DEBUG nova.virt.hardware [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 516.364387] env[60044]: DEBUG nova.virt.hardware [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 516.364387] env[60044]: DEBUG nova.virt.hardware [None 
req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 516.364387] env[60044]: DEBUG nova.virt.hardware [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 516.364387] env[60044]: DEBUG nova.virt.hardware [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 516.364387] env[60044]: DEBUG nova.virt.hardware [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 516.364535] env[60044]: DEBUG nova.virt.hardware [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 516.364535] env[60044]: DEBUG nova.virt.hardware [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 516.364535] env[60044]: DEBUG nova.virt.hardware [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 516.365471] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-356eb8f0-a80e-43d4-8122-280a81116df4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.372364] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 516.372829] env[60044]: DEBUG nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Start building networks asynchronously for instance. 
{{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 516.379635] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88a31237-eb55-4703-b90e-3ce5f2ebadfe {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.410449] env[60044]: DEBUG nova.compute.utils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 516.411695] env[60044]: DEBUG nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 516.411855] env[60044]: DEBUG nova.network.neutron [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 516.424259] env[60044]: DEBUG nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 516.513515] env[60044]: DEBUG nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 516.541828] env[60044]: DEBUG nova.virt.hardware [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 516.541990] env[60044]: DEBUG nova.virt.hardware [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 516.542170] env[60044]: DEBUG nova.virt.hardware [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 516.542356] env[60044]: DEBUG nova.virt.hardware [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 516.542538] env[60044]: DEBUG nova.virt.hardware [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 516.542684] env[60044]: DEBUG nova.virt.hardware [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 516.542889] env[60044]: DEBUG nova.virt.hardware [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 516.543235] env[60044]: DEBUG nova.virt.hardware [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 516.543324] env[60044]: DEBUG nova.virt.hardware [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 
tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 516.543479] env[60044]: DEBUG nova.virt.hardware [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 516.543652] env[60044]: DEBUG nova.virt.hardware [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 516.544526] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e78e8d02-88c9-4636-b2d2-4a9eca171c77 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.552747] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b09f7c52-0665-49d0-bc9f-7a8073e72aa1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 516.814828] env[60044]: DEBUG nova.policy [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9b20a4b99c3041d986483e1c4d1cbe79', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a07d0346e8884cf394bb87ea702ec039', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 517.011280] env[60044]: DEBUG nova.policy [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '588d0c5d584544c3be2d880de2c00a37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7913858bdbbe4375917c0e1864ee8d2e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 518.183812] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "23984fc7-95de-43c3-a21e-894fab241dce" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 518.184180] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 
tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "23984fc7-95de-43c3-a21e-894fab241dce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 518.203491] env[60044]: DEBUG nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 518.275847] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 518.276431] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 518.281524] env[60044]: INFO nova.compute.claims [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 518.451908] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7952e637-c4d9-4fcd-befd-07fc0410b465 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 518.460247] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8af79022-6827-41dd-a56e-db6afcbc0776 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 518.491928] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88df5512-6b1f-43c7-ab53-f2f0ec24bf74 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 518.501010] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd3dc0fc-d51a-49f0-81e3-e3b969021a4d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 518.514213] env[60044]: DEBUG nova.compute.provider_tree [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 518.527364] env[60044]: DEBUG nova.scheduler.client.report [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Inventory has 
not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 518.541410] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 518.542078] env[60044]: DEBUG nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 518.594277] env[60044]: DEBUG nova.compute.utils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 518.596373] env[60044]: DEBUG nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 518.596373] env[60044]: DEBUG nova.network.neutron [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 518.616559] env[60044]: DEBUG nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 518.694734] env[60044]: DEBUG nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Start spawning the instance on the hypervisor. 
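Before touching Placement, the report client above compares the provider's current inventory with what was last reported and skips the update because nothing changed. A small sketch of that comparison over the same inventory structure shown in the log; the function name and the cached/reported split are assumptions for illustration:

```python
# Hypothetical sketch of the "inventory has not changed" short-circuit: only
# push inventory to Placement when a resource class was added, removed or
# altered.  The data below is the inventory printed in the log entry above.
current = {
    'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
             'step_size': 1, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                  'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176,
                'step_size': 1, 'allocation_ratio': 1.0},
}


def inventory_changed(cached, reported):
    """Return True when any resource class differs between the two views."""
    if cached.keys() != reported.keys():
        return True
    return any(cached[rc] != reported[rc] for rc in cached)


if __name__ == "__main__":
    reported = {rc: dict(fields) for rc, fields in current.items()}
    print(inventory_changed(current, reported))    # False -> skip the update
    reported['DISK_GB']['total'] = 500
    print(inventory_changed(current, reported))    # True  -> push new inventory
```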
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 518.724838] env[60044]: DEBUG nova.virt.hardware [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 518.725606] env[60044]: DEBUG nova.virt.hardware [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 518.725606] env[60044]: DEBUG nova.virt.hardware [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 518.725773] env[60044]: DEBUG nova.virt.hardware [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 518.725996] env[60044]: DEBUG nova.virt.hardware [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 518.726219] env[60044]: DEBUG nova.virt.hardware [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 518.726868] env[60044]: DEBUG nova.virt.hardware [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 518.726868] env[60044]: DEBUG nova.virt.hardware [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 518.726868] env[60044]: DEBUG nova.virt.hardware [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 518.726979] env[60044]: DEBUG nova.virt.hardware [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 518.727236] env[60044]: DEBUG nova.virt.hardware [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 518.728712] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d693b6bb-5281-4a72-a102-3b110415b4f8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 518.743898] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7cea73b-75c4-4724-9bfa-4a76fb774129 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 518.921092] env[60044]: DEBUG nova.network.neutron [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Successfully created port: e5151993-a08c-45a0-81c4-df2e7c6d3ad6 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 519.208288] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "a93c0169-490e-4cd2-b890-5e1d8aecae59" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 519.208546] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "a93c0169-490e-4cd2-b890-5e1d8aecae59" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 519.229120] env[60044]: DEBUG nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Starting instance... 
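The hardware debug entries above enumerate every sockets/cores/threads split of the flavor's vCPU count that fits inside the default 65536 limits, which for a single vCPU collapses to 1:1:1. A hypothetical sketch of that enumeration (not the actual nova.virt.hardware code):

```python
# Hypothetical sketch of the possible-topology enumeration logged above: list
# every sockets/cores/threads factorisation of the vCPU count within the
# limits; for vcpus=1 the only result is cores=1, sockets=1, threads=1.
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")


def possible_cpu_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
    topologies = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        for cores in range(1, min(vcpus, max_cores) + 1):
            for threads in range(1, min(vcpus, max_threads) + 1):
                if sockets * cores * threads == vcpus:
                    topologies.append(VirtCPUTopology(sockets, cores, threads))
    return topologies


if __name__ == "__main__":
    print(possible_cpu_topologies(1))      # one topology: 1 socket, 1 core, 1 thread
    print(possible_cpu_topologies(4)[:3])  # several splits exist for 4 vCPUs
```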
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 519.232264] env[60044]: DEBUG nova.network.neutron [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Successfully updated port: 26a18662-8fba-4d77-a530-f366d7c04bd8 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 519.263770] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "refresh_cache-43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 519.263950] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquired lock "refresh_cache-43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 519.264123] env[60044]: DEBUG nova.network.neutron [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 519.295953] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 519.296217] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 519.297660] env[60044]: INFO nova.compute.claims [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 519.477149] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87b2a764-1306-4c88-befd-12b3c66fbca6 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.493708] env[60044]: DEBUG nova.network.neutron [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Instance cache missing network info. 
{{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 519.498786] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c64cdf78-7399-4ba5-8578-5f64c6dfcabb {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.534614] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78188605-5c38-45dd-a015-83aae1ce468a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.543259] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2eedf568-e9b9-47b6-9314-7925d4dfc38a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.556983] env[60044]: DEBUG nova.compute.provider_tree [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 519.570226] env[60044]: DEBUG nova.scheduler.client.report [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 519.591415] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.295s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 519.592026] env[60044]: DEBUG nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 519.640495] env[60044]: DEBUG nova.compute.utils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 519.645768] env[60044]: DEBUG nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Not allocating networking since 'none' was specified. 
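"Using /dev/sd instead of None" records the fallback when the request supplies no device name: take the /dev/sd prefix and hand out the next unused letter. A simplified sketch of that selection, assuming single-letter suffixes only:

```python
# Hypothetical sketch of the next-device-name fallback behind
# "Using /dev/sd instead of None" in the entries above.
import string


def get_next_device_name(used_names, prefix="/dev/sd"):
    """Return the first unused /dev/sd<letter> name (single letters only)."""
    used_letters = {
        name[len(prefix):] for name in used_names if name.startswith(prefix)
    }
    for letter in string.ascii_lowercase:
        if letter not in used_letters:
            return prefix + letter
    raise ValueError("no free device names left")


if __name__ == "__main__":
    print(get_next_device_name([]))                        # /dev/sda
    print(get_next_device_name(["/dev/sda", "/dev/sdb"]))  # /dev/sdc
```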
{{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 519.671198] env[60044]: DEBUG nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 519.702214] env[60044]: DEBUG nova.policy [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c175fa3bb59a42898f1fe8ea193beb7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d9184500ed74ad7bee0d5616a6dc843', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 519.776713] env[60044]: DEBUG nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Start spawning the instance on the hypervisor. {{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 519.783392] env[60044]: DEBUG nova.network.neutron [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Successfully created port: 96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 519.806249] env[60044]: DEBUG nova.virt.hardware [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 519.806249] env[60044]: DEBUG nova.virt.hardware [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 519.806249] env[60044]: DEBUG nova.virt.hardware [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Image limits 0:0:0 
{{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 519.806417] env[60044]: DEBUG nova.virt.hardware [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 519.806417] env[60044]: DEBUG nova.virt.hardware [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 519.806536] env[60044]: DEBUG nova.virt.hardware [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 519.807277] env[60044]: DEBUG nova.virt.hardware [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 519.807277] env[60044]: DEBUG nova.virt.hardware [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 519.807277] env[60044]: DEBUG nova.virt.hardware [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 519.807277] env[60044]: DEBUG nova.virt.hardware [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 519.807512] env[60044]: DEBUG nova.virt.hardware [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 519.808260] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29cc2ca5-9d16-423f-8003-66dddaea66ae {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.818977] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c2f7e55-611c-4026-a38a-c1d0e4b3983f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.837150] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Instance VIF info [] {{(pid=60044) 
build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 519.849638] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 519.850668] env[60044]: DEBUG nova.network.neutron [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Successfully created port: 95b54886-0bbe-4351-8124-6f37519af668 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 519.854718] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d31ae2e0-31a8-4cfa-8cd4-2ecddd65f1cc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.867980] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Created folder: OpenStack in parent group-v4. [ 519.868543] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Creating folder: Project (da83cfec83084d7bbcc161d5d7b287c4). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 519.869660] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-451c3f70-f709-4e44-b81d-42aaddb381ba {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.880326] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Created folder: Project (da83cfec83084d7bbcc161d5d7b287c4) in parent group-v449562. [ 519.880408] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Creating folder: Instances. Parent ref: group-v449563. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 519.880635] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e2aa0458-6898-4cd2-8d2b-473775151c25 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.893669] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Created folder: Instances in parent group-v449563. [ 519.893960] env[60044]: DEBUG oslo.service.loopingcall [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
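The folder calls above build the OpenStack, Project (...), Instances hierarchy one level at a time, reusing a level when it already exists and otherwise issuing CreateFolder against the parent reference. A toy in-memory sketch of that idempotent pattern (the Folder class is purely illustrative, not the vSphere API):

```python
# Hypothetical sketch of idempotent nested folder creation as logged above:
# each level is created only if missing, and the returned reference becomes
# the parent for the next CreateFolder call.
class Folder:
    def __init__(self, name):
        self.name = name
        self.children = {}

    def create_folder(self, name):
        """Return the existing child folder, or create it (CreateFolder)."""
        if name not in self.children:
            self.children[name] = Folder(name)   # "Created folder: <name> in parent ..."
        return self.children[name]


if __name__ == "__main__":
    vm_root = Folder("vm")                       # stand-in for the parent ref (group-v4)
    parent = vm_root
    for name in ("OpenStack",
                 "Project (da83cfec83084d7bbcc161d5d7b287c4)",
                 "Instances"):
        parent = parent.create_folder(name)
    print(parent.name)                           # "Instances"
```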
{{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 519.894380] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 519.894380] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1836d2a6-fb90-4a88-8d02-896a8850935b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.916735] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 519.916735] env[60044]: value = "task-2204678" [ 519.916735] env[60044]: _type = "Task" [ 519.916735] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 519.927121] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204678, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 520.113122] env[60044]: DEBUG nova.compute.manager [req-b042827c-404f-4999-a10e-4ff671b25772 req-ede362d9-dbcc-435d-b965-9cc318837c93 service nova] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Received event network-vif-plugged-26a18662-8fba-4d77-a530-f366d7c04bd8 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 520.113348] env[60044]: DEBUG oslo_concurrency.lockutils [req-b042827c-404f-4999-a10e-4ff671b25772 req-ede362d9-dbcc-435d-b965-9cc318837c93 service nova] Acquiring lock "43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 520.113614] env[60044]: DEBUG oslo_concurrency.lockutils [req-b042827c-404f-4999-a10e-4ff671b25772 req-ede362d9-dbcc-435d-b965-9cc318837c93 service nova] Lock "43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 520.113706] env[60044]: DEBUG oslo_concurrency.lockutils [req-b042827c-404f-4999-a10e-4ff671b25772 req-ede362d9-dbcc-435d-b965-9cc318837c93 service nova] Lock "43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 520.113864] env[60044]: DEBUG nova.compute.manager [req-b042827c-404f-4999-a10e-4ff671b25772 req-ede362d9-dbcc-435d-b965-9cc318837c93 service nova] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] No waiting events found dispatching network-vif-plugged-26a18662-8fba-4d77-a530-f366d7c04bd8 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 520.114134] env[60044]: WARNING nova.compute.manager [req-b042827c-404f-4999-a10e-4ff671b25772 req-ede362d9-dbcc-435d-b965-9cc318837c93 service nova] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Received unexpected event network-vif-plugged-26a18662-8fba-4d77-a530-f366d7c04bd8 for instance with vm_state building and task_state spawning. [ 520.428685] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204678, 'name': CreateVM_Task, 'duration_secs': 0.263652} completed successfully. 
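CreateVM_Task above follows the usual vCenter pattern: submit the task, then poll it until it reports success or error, logging progress along the way. A generic polling sketch under those assumptions; the poll_task callable and its return shape are invented for illustration:

```python
# Hypothetical sketch of the task-wait loop behind "Waiting for the task ...
# to complete" / "progress is 0%" / "completed successfully" above.
import time


def wait_for_task(poll_task, interval=0.5, timeout=60):
    """Poll `poll_task()` until it reports success or error.

    `poll_task` is assumed to return a dict like
    {'state': 'running', 'progress': 40} or {'state': 'success'}.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = poll_task()
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            raise RuntimeError(info.get('error', 'task failed'))
        time.sleep(interval)     # task still queued or running; poll again
    raise TimeoutError("task did not complete in time")


if __name__ == "__main__":
    progress = iter([{'state': 'running', 'progress': 0},
                     {'state': 'success', 'progress': 100}])
    print(wait_for_task(lambda: next(progress)))
```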
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 520.428685] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 520.429796] env[60044]: DEBUG oslo_vmware.service [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15ad0a2f-3752-4c30-a470-36621b607e42 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 520.438899] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 520.438899] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 520.438899] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 520.438899] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6595e46f-0cbc-4a18-8dfb-0a181f150af8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 520.444984] env[60044]: DEBUG oslo_vmware.api [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Waiting for the task: (returnval){ [ 520.444984] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]522215ff-3312-7ea0-4be9-d2e6c69625a0" [ 520.444984] env[60044]: _type = "Task" [ 520.444984] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 520.454518] env[60044]: DEBUG oslo_vmware.api [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]522215ff-3312-7ea0-4be9-d2e6c69625a0, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 520.684489] env[60044]: DEBUG nova.network.neutron [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Updating instance_info_cache with network_info: [{"id": "26a18662-8fba-4d77-a530-f366d7c04bd8", "address": "fa:16:3e:d9:8b:b8", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap26a18662-8f", "ovs_interfaceid": "26a18662-8fba-4d77-a530-f366d7c04bd8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 520.719369] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Releasing lock "refresh_cache-43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 520.719369] env[60044]: DEBUG nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Instance network_info: |[{"id": "26a18662-8fba-4d77-a530-f366d7c04bd8", "address": "fa:16:3e:d9:8b:b8", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap26a18662-8f", "ovs_interfaceid": "26a18662-8fba-4d77-a530-f366d7c04bd8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 520.719722] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None 
req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d9:8b:b8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '26a18662-8fba-4d77-a530-f366d7c04bd8', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 520.733539] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Creating folder: Project (96139b5146b84a968b5a8e9c51ada438). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 520.735659] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-720ce795-9297-423a-b057-b35dfe8f5227 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 520.752024] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Created folder: Project (96139b5146b84a968b5a8e9c51ada438) in parent group-v449562. [ 520.752201] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Creating folder: Instances. Parent ref: group-v449566. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 520.752429] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b6bd1475-b54f-4618-bafb-82666ab22cb6 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 520.762879] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Created folder: Instances in parent group-v449566. [ 520.762879] env[60044]: DEBUG oslo.service.loopingcall [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 520.762879] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 520.762879] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8df34549-b3c2-4cf9-9c21-bb86b4ca257a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 520.785913] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 520.785913] env[60044]: value = "task-2204681" [ 520.785913] env[60044]: _type = "Task" [ 520.785913] env[60044]: } to complete. 
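The network_info blob logged above is reduced to the compact "Instance VIF info" list the VMware spawn path consumes: bridge name, MAC address, an OpaqueNetwork reference built from the NSX logical-switch id, the Neutron port id, and the vmxnet3 model. A hypothetical mapping sketch with that shape (function name and input layout are assumptions):

```python
# Hypothetical sketch of turning Neutron network_info entries (as logged above)
# into the simplified VIF-info dicts passed to build_virtual_machine.
def get_vif_info(network_info, vif_model="vmxnet3"):
    vif_infos = []
    for vif in network_info:
        details = vif.get("details", {})
        vif_infos.append({
            "network_name": vif["network"]["bridge"],
            "mac_address": vif["address"],
            "network_ref": {
                "type": "OpaqueNetwork",
                "network-id": details.get("nsx-logical-switch-id"),
                "network-type": "nsx.LogicalSwitch",
                "use-external-id": True,
            },
            "iface_id": vif["id"],
            "vif_model": vif_model,
        })
    return vif_infos


if __name__ == "__main__":
    network_info = [{
        "id": "26a18662-8fba-4d77-a530-f366d7c04bd8",
        "address": "fa:16:3e:d9:8b:b8",
        "network": {"bridge": "br-int"},
        "details": {"nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173"},
    }]
    print(get_vif_info(network_info))
```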
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 520.797468] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204681, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 520.957342] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 520.957632] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 520.957857] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 520.957996] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 520.958447] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 520.958710] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2c454186-d994-4d47-bfa6-47d684156017 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 520.966459] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 520.966629] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Folder [datastore2] devstack-image-cache_base created. 
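The image-cache entries above serialize on the image id, check whether the cached VMDK already exists on the datastore, and only then prepare a fetch location and download. A simplified local-filesystem sketch of that fetch-if-missing flow; the paths, helper names, and temp-file handling are assumptions, and the real flow stages through a vmware_temp directory on the datastore rather than a sibling file:

```python
# Hypothetical sketch of fetch-if-missing around the datastore image cache:
# one fetch per image id at a time, reuse the cached VMDK on a hit, otherwise
# download to a temporary file and publish it into the cache.
import os
import tempfile
import threading

_image_locks = {}
_registry_lock = threading.Lock()


def _image_lock(image_id):
    with _registry_lock:
        return _image_locks.setdefault(image_id, threading.Lock())


def fetch_image_if_missing(cache_root, image_id, fetch):
    """`fetch(dst_path)` is a stand-in for the HTTP download to the datastore."""
    image_dir = os.path.join(cache_root, image_id)
    cached_vmdk = os.path.join(image_dir, image_id + ".vmdk")
    with _image_lock(image_id):              # serialize per image id
        if os.path.exists(cached_vmdk):      # cache hit: nothing to fetch
            return cached_vmdk
        os.makedirs(image_dir, exist_ok=True)   # "Creating directory with path ..."
        tmp_path = cached_vmdk + ".tmp-sparse"
        fetch(tmp_path)                      # download into a temporary file first
        os.replace(tmp_path, cached_vmdk)    # publish atomically into the cache
        return cached_vmdk


if __name__ == "__main__":
    cache_root = tempfile.mkdtemp(prefix="devstack-image-cache_base-")
    path = fetch_image_if_missing(
        cache_root,
        "856e89ba-b7a4-4a81-ad9d-2997fe327c0c",
        lambda dst: open(dst, "wb").close(),  # stand-in for the HTTP download
    )
    print(path)
```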
{{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 520.967414] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d7a75a1-2789-46ef-94c2-1b3e2c49622c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 520.977315] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7ef52ec5-5622-46d1-8943-a45e1ad98ea4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 520.982455] env[60044]: DEBUG oslo_vmware.api [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Waiting for the task: (returnval){ [ 520.982455] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]5293c836-f0d8-291f-f094-8fff5e5ace5c" [ 520.982455] env[60044]: _type = "Task" [ 520.982455] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 520.993499] env[60044]: DEBUG oslo_vmware.api [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]5293c836-f0d8-291f-f094-8fff5e5ace5c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 521.297714] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204681, 'name': CreateVM_Task, 'duration_secs': 0.333733} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 521.297898] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 521.416582] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 521.416582] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 521.416582] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 521.416582] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8971267f-c8e2-484b-9283-402e479b56e4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 521.428610] env[60044]: DEBUG oslo_vmware.api [None req-3048873c-2770-4203-b51f-4df52d73f47e 
tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Waiting for the task: (returnval){ [ 521.428610] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]524508ae-334a-af69-69cb-49a32331f21b" [ 521.428610] env[60044]: _type = "Task" [ 521.428610] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 521.446339] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 521.446339] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 521.446339] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 521.498524] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 521.498524] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Creating directory with path [datastore2] vmware_temp/2f6f7529-d35c-43f8-a466-46dcb52e759c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 521.498524] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b2a3c4a5-88bb-4d0c-ab8a-664c38ffde91 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 521.509771] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Created directory with path [datastore2] vmware_temp/2f6f7529-d35c-43f8-a466-46dcb52e759c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 521.509905] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Fetch image to [datastore2] vmware_temp/2f6f7529-d35c-43f8-a466-46dcb52e759c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 521.510023] env[60044]: DEBUG 
nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/2f6f7529-d35c-43f8-a466-46dcb52e759c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 521.510811] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91534d91-b8ec-4216-a519-c5ad6a3e9ba6 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 521.518778] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4c03670-8549-427e-be8c-06349c56de0a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 521.529979] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d96b59cb-c149-4e64-ae9a-7d52891fca59 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 521.568876] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36d31893-5a0e-4462-bcea-66c71ab0c6b1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 521.577725] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ae54ec49-c07a-4f9a-947b-004bd24b21f8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 521.676672] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 521.754224] env[60044]: DEBUG oslo_vmware.rw_handles [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2f6f7529-d35c-43f8-a466-46dcb52e759c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 521.831716] env[60044]: DEBUG oslo_vmware.rw_handles [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Completed reading data from the image iterator. 
{{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 521.832017] env[60044]: DEBUG oslo_vmware.rw_handles [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2f6f7529-d35c-43f8-a466-46dcb52e759c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 522.748492] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "ce718fc3-6f75-49b9-8543-c953646ce0d9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 522.748492] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "ce718fc3-6f75-49b9-8543-c953646ce0d9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 522.761595] env[60044]: DEBUG nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Starting instance... 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 522.774610] env[60044]: DEBUG nova.network.neutron [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Successfully created port: e6b829be-7469-47df-86d8-aaa5c4008cbe {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 522.832899] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 522.833186] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 522.836169] env[60044]: INFO nova.compute.claims [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 523.158267] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2128571-a6e4-48f4-8e7b-ea399b581d92 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 523.166896] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bed27d4-100c-4970-b8ae-46b0d009a518 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 523.204376] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8f7fc78-5e48-45a6-a591-1da49322b9e9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 523.213628] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38616141-3771-43c3-94a1-458ff984dc61 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 523.232300] env[60044]: DEBUG nova.compute.provider_tree [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 523.246951] env[60044]: DEBUG nova.scheduler.client.report [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 523.266650] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.433s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 523.267013] env[60044]: DEBUG nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 523.322925] env[60044]: DEBUG nova.compute.utils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 523.322925] env[60044]: DEBUG nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 523.323040] env[60044]: DEBUG nova.network.neutron [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 523.341167] env[60044]: DEBUG nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 523.444604] env[60044]: DEBUG nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 523.475215] env[60044]: DEBUG nova.virt.hardware [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 523.475453] env[60044]: DEBUG nova.virt.hardware [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 523.475605] env[60044]: DEBUG nova.virt.hardware [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 523.475785] env[60044]: DEBUG nova.virt.hardware [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 523.475927] env[60044]: DEBUG nova.virt.hardware [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 523.476189] env[60044]: DEBUG nova.virt.hardware [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 523.476475] env[60044]: DEBUG nova.virt.hardware [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 523.476627] env[60044]: DEBUG nova.virt.hardware [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 523.476823] 
env[60044]: DEBUG nova.virt.hardware [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 523.476943] env[60044]: DEBUG nova.virt.hardware [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 523.477174] env[60044]: DEBUG nova.virt.hardware [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 523.478759] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa2a6d81-c0d2-40d5-8283-2285f83fa0a3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 523.487159] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffeaa22c-0a9d-454f-bd36-4d64431eb4d7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 523.689615] env[60044]: DEBUG nova.policy [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b786da2369eb45ab916b9e137d644dc8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eb70c075cb2e4c44917d5ba6cb849786', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 524.036041] env[60044]: DEBUG nova.compute.manager [req-e9761242-247f-434c-af39-6a56f484180e req-a9f05abb-cdef-4a93-86c8-809e0f1551ad service nova] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Received event network-changed-26a18662-8fba-4d77-a530-f366d7c04bd8 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 524.036414] env[60044]: DEBUG nova.compute.manager [req-e9761242-247f-434c-af39-6a56f484180e req-a9f05abb-cdef-4a93-86c8-809e0f1551ad service nova] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Refreshing instance network info cache due to event network-changed-26a18662-8fba-4d77-a530-f366d7c04bd8. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 524.036483] env[60044]: DEBUG oslo_concurrency.lockutils [req-e9761242-247f-434c-af39-6a56f484180e req-a9f05abb-cdef-4a93-86c8-809e0f1551ad service nova] Acquiring lock "refresh_cache-43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 524.036671] env[60044]: DEBUG oslo_concurrency.lockutils [req-e9761242-247f-434c-af39-6a56f484180e req-a9f05abb-cdef-4a93-86c8-809e0f1551ad service nova] Acquired lock "refresh_cache-43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 524.036760] env[60044]: DEBUG nova.network.neutron [req-e9761242-247f-434c-af39-6a56f484180e req-a9f05abb-cdef-4a93-86c8-809e0f1551ad service nova] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Refreshing network info cache for port 26a18662-8fba-4d77-a530-f366d7c04bd8 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 524.294902] env[60044]: DEBUG nova.network.neutron [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Successfully updated port: e5151993-a08c-45a0-81c4-df2e7c6d3ad6 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 524.312253] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "refresh_cache-db1dd823-8349-4f34-9a8e-ecec90bd105b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 524.312402] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquired lock "refresh_cache-db1dd823-8349-4f34-9a8e-ecec90bd105b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 524.312557] env[60044]: DEBUG nova.network.neutron [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 524.379957] env[60044]: DEBUG nova.network.neutron [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Successfully updated port: 96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 524.390022] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "refresh_cache-ebc60b43-dc9e-4f3c-81c7-f65fe50be628" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 524.390171] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 
tempest-MigrationsAdminTest-2016093575-project-member] Acquired lock "refresh_cache-ebc60b43-dc9e-4f3c-81c7-f65fe50be628" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 524.392242] env[60044]: DEBUG nova.network.neutron [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 524.442410] env[60044]: DEBUG nova.network.neutron [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 524.604025] env[60044]: DEBUG nova.network.neutron [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 524.650386] env[60044]: DEBUG nova.network.neutron [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Successfully updated port: 95b54886-0bbe-4351-8124-6f37519af668 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 524.672450] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "refresh_cache-4e62d785-7c74-4d3a-9446-e690822d5386" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 524.672985] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquired lock "refresh_cache-4e62d785-7c74-4d3a-9446-e690822d5386" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 524.672985] env[60044]: DEBUG nova.network.neutron [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 524.976099] env[60044]: DEBUG nova.compute.manager [req-410f10c9-0905-44f7-9f28-c885c16de38d req-58b92f32-1f82-4675-823c-adf04b2bba28 service nova] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Received event network-vif-plugged-e5151993-a08c-45a0-81c4-df2e7c6d3ad6 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 524.976303] env[60044]: DEBUG oslo_concurrency.lockutils [req-410f10c9-0905-44f7-9f28-c885c16de38d req-58b92f32-1f82-4675-823c-adf04b2bba28 service nova] Acquiring lock "db1dd823-8349-4f34-9a8e-ecec90bd105b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 524.976523] env[60044]: DEBUG oslo_concurrency.lockutils [req-410f10c9-0905-44f7-9f28-c885c16de38d req-58b92f32-1f82-4675-823c-adf04b2bba28 service nova] Lock "db1dd823-8349-4f34-9a8e-ecec90bd105b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 524.976682] env[60044]: DEBUG oslo_concurrency.lockutils [req-410f10c9-0905-44f7-9f28-c885c16de38d req-58b92f32-1f82-4675-823c-adf04b2bba28 service nova] Lock "db1dd823-8349-4f34-9a8e-ecec90bd105b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 524.976838] env[60044]: DEBUG nova.compute.manager [req-410f10c9-0905-44f7-9f28-c885c16de38d req-58b92f32-1f82-4675-823c-adf04b2bba28 service nova] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] No waiting events found dispatching network-vif-plugged-e5151993-a08c-45a0-81c4-df2e7c6d3ad6 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 524.976993] env[60044]: WARNING nova.compute.manager [req-410f10c9-0905-44f7-9f28-c885c16de38d req-58b92f32-1f82-4675-823c-adf04b2bba28 service nova] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Received unexpected event network-vif-plugged-e5151993-a08c-45a0-81c4-df2e7c6d3ad6 for instance with vm_state building and task_state spawning. [ 525.027527] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 525.028252] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 525.028452] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 525.028570] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 525.075192] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 525.075425] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 525.075643] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Skipping network cache update for instance because it is Building. 
{{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 525.075643] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 525.075753] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 525.076233] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 525.076455] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 525.076615] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 525.077171] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 525.077421] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 525.077825] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 525.077825] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 525.077960] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 525.078161] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 525.078324] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 525.078714] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 525.093908] env[60044]: DEBUG nova.network.neutron [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 525.101655] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 525.101655] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 525.102028] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 525.102028] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 525.104696] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-154a256c-3ff3-49e5-84e9-1cd338968911 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 525.115811] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1ce5809-4d8e-41c7-8092-30ee8b65146e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 525.134168] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5327a2e4-5f5a-4dbe-ad1b-0b954944d756 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 525.146188] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71ee71a6-f683-40f5-8a8c-e21b1f238f37 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 525.182140] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181268MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 525.182140] env[60044]: DEBUG 
oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 525.182140] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 525.290689] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 525.290844] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance db1dd823-8349-4f34-9a8e-ecec90bd105b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 525.291035] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ebc60b43-dc9e-4f3c-81c7-f65fe50be628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 525.291377] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 4e62d785-7c74-4d3a-9446-e690822d5386 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 525.291524] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 23984fc7-95de-43c3-a21e-894fab241dce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 525.291641] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance a93c0169-490e-4cd2-b890-5e1d8aecae59 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 525.291758] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ce718fc3-6f75-49b9-8543-c953646ce0d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 525.291944] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 525.292087] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 525.433779] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8155835-ec86-41a2-bcaf-6b7c3fbd5ece {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 525.445642] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ead40f21-4567-45b0-8633-53f201161c33 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 525.490658] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-117194f7-66ff-4fdd-b491-7371bc68d86a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 525.498971] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e94875d-9c5b-4089-9100-43bfa87ccd76 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 525.518098] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 525.535637] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 525.555746] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 525.555935] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.375s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 526.018148] env[60044]: DEBUG nova.network.neutron [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a 
tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Successfully created port: 2969c86b-c4d7-431e-a7bf-a229e30feccb {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 526.313267] env[60044]: DEBUG nova.network.neutron [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Updating instance_info_cache with network_info: [{"id": "e5151993-a08c-45a0-81c4-df2e7c6d3ad6", "address": "fa:16:3e:a0:63:de", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape5151993-a0", "ovs_interfaceid": "e5151993-a08c-45a0-81c4-df2e7c6d3ad6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 526.325611] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Releasing lock "refresh_cache-db1dd823-8349-4f34-9a8e-ecec90bd105b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 526.326380] env[60044]: DEBUG nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Instance network_info: |[{"id": "e5151993-a08c-45a0-81c4-df2e7c6d3ad6", "address": "fa:16:3e:a0:63:de", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape5151993-a0", "ovs_interfaceid": "e5151993-a08c-45a0-81c4-df2e7c6d3ad6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 526.327224] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a0:63:de', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e5151993-a08c-45a0-81c4-df2e7c6d3ad6', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 526.341585] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Creating folder: Project (a8331e6e26314ee3bd30dd7f6494daf4). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 526.341585] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2ddabd46-3ceb-4985-9bd2-e519bd60b544 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 526.351258] env[60044]: DEBUG nova.network.neutron [req-e9761242-247f-434c-af39-6a56f484180e req-a9f05abb-cdef-4a93-86c8-809e0f1551ad service nova] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Updated VIF entry in instance network info cache for port 26a18662-8fba-4d77-a530-f366d7c04bd8. {{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 526.351258] env[60044]: DEBUG nova.network.neutron [req-e9761242-247f-434c-af39-6a56f484180e req-a9f05abb-cdef-4a93-86c8-809e0f1551ad service nova] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Updating instance_info_cache with network_info: [{"id": "26a18662-8fba-4d77-a530-f366d7c04bd8", "address": "fa:16:3e:d9:8b:b8", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap26a18662-8f", "ovs_interfaceid": "26a18662-8fba-4d77-a530-f366d7c04bd8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 526.359660] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 
tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Created folder: Project (a8331e6e26314ee3bd30dd7f6494daf4) in parent group-v449562. [ 526.359660] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Creating folder: Instances. Parent ref: group-v449569. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 526.359660] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-251d7d28-39ee-4df0-b24d-0657a6d0e558 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 526.362902] env[60044]: DEBUG oslo_concurrency.lockutils [req-e9761242-247f-434c-af39-6a56f484180e req-a9f05abb-cdef-4a93-86c8-809e0f1551ad service nova] Releasing lock "refresh_cache-43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 526.370988] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Created folder: Instances in parent group-v449569. [ 526.371516] env[60044]: DEBUG oslo.service.loopingcall [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 526.371778] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 526.372049] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e083babb-cab0-4a87-9afa-9240b984fd12 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 526.388665] env[60044]: DEBUG nova.network.neutron [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Updating instance_info_cache with network_info: [{"id": "96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2", "address": "fa:16:3e:44:d8:6c", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap96b0e278-5e", "ovs_interfaceid": "96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 526.395648] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 526.395648] env[60044]: value = "task-2204684" [ 526.395648] env[60044]: _type = "Task" [ 526.395648] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 526.400514] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Releasing lock "refresh_cache-ebc60b43-dc9e-4f3c-81c7-f65fe50be628" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 526.401306] env[60044]: DEBUG nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Instance network_info: |[{"id": "96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2", "address": "fa:16:3e:44:d8:6c", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap96b0e278-5e", "ovs_interfaceid": "96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 526.404929] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:44:d8:6c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 526.414350] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Creating folder: Project (7913858bdbbe4375917c0e1864ee8d2e). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 526.414648] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204684, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 526.415160] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-65427fb2-c506-40f1-98eb-ea389b4bf432 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 526.425435] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Created folder: Project (7913858bdbbe4375917c0e1864ee8d2e) in parent group-v449562. [ 526.425637] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Creating folder: Instances. Parent ref: group-v449572. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 526.425877] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-887e73c3-c91c-472a-8687-832407c4a041 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 526.437989] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Created folder: Instances in parent group-v449572. [ 526.438300] env[60044]: DEBUG oslo.service.loopingcall [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 526.438456] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 526.438653] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-828f6eb6-df31-45b9-9ef9-be031243eb54 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 526.461021] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 526.461021] env[60044]: value = "task-2204687" [ 526.461021] env[60044]: _type = "Task" [ 526.461021] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 526.473567] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204687, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 526.591224] env[60044]: DEBUG nova.network.neutron [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Updating instance_info_cache with network_info: [{"id": "95b54886-0bbe-4351-8124-6f37519af668", "address": "fa:16:3e:b4:0e:da", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.109", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap95b54886-0b", "ovs_interfaceid": "95b54886-0bbe-4351-8124-6f37519af668", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 526.602596] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Releasing lock "refresh_cache-4e62d785-7c74-4d3a-9446-e690822d5386" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 526.603088] env[60044]: DEBUG nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Instance network_info: |[{"id": "95b54886-0bbe-4351-8124-6f37519af668", "address": "fa:16:3e:b4:0e:da", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.109", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap95b54886-0b", "ovs_interfaceid": "95b54886-0bbe-4351-8124-6f37519af668", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 526.603968] env[60044]: DEBUG 
nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b4:0e:da', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '95b54886-0bbe-4351-8124-6f37519af668', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 526.616444] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Creating folder: Project (a07d0346e8884cf394bb87ea702ec039). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 526.617111] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-56581ec7-fe72-403e-b5e0-d10f4a892134 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 526.627013] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Created folder: Project (a07d0346e8884cf394bb87ea702ec039) in parent group-v449562. [ 526.627465] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Creating folder: Instances. Parent ref: group-v449575. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 526.627581] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a5612e88-87f2-46ce-975c-cb2cda9d535e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 526.636220] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Created folder: Instances in parent group-v449575. [ 526.636679] env[60044]: DEBUG oslo.service.loopingcall [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 526.636918] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 526.637183] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-64d4988f-0d0d-450b-955a-3075f2261c6e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 526.659327] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 526.659327] env[60044]: value = "task-2204690" [ 526.659327] env[60044]: _type = "Task" [ 526.659327] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 526.671882] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204690, 'name': CreateVM_Task} progress is 5%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 526.837044] env[60044]: DEBUG nova.network.neutron [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Successfully updated port: e6b829be-7469-47df-86d8-aaa5c4008cbe {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 526.848776] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "refresh_cache-23984fc7-95de-43c3-a21e-894fab241dce" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 526.848776] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquired lock "refresh_cache-23984fc7-95de-43c3-a21e-894fab241dce" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 526.849039] env[60044]: DEBUG nova.network.neutron [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 526.907557] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204684, 'name': CreateVM_Task, 'duration_secs': 0.338619} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 526.908129] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 526.908542] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 526.908816] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 526.908988] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 526.909266] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3859af3b-e55a-459e-9785-fb591ea93a72 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 526.914539] env[60044]: DEBUG oslo_vmware.api [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Waiting for the task: (returnval){ [ 526.914539] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52d87736-394d-97e8-0ef3-4821c6ca0da5" [ 526.914539] env[60044]: _type = "Task" [ 526.914539] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 526.926043] env[60044]: DEBUG oslo_vmware.api [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52d87736-394d-97e8-0ef3-4821c6ca0da5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 526.963994] env[60044]: DEBUG nova.network.neutron [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 526.974970] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204687, 'name': CreateVM_Task, 'duration_secs': 0.315086} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 526.975101] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 526.975783] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 527.174507] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204690, 'name': CreateVM_Task, 'duration_secs': 0.300215} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 527.174694] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 527.175333] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 527.427964] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 527.429137] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 527.429137] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 527.429137] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 527.429425] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 527.430111] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3522966a-48c3-4636-a4d1-0f7d87aa79dc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 527.439227] env[60044]: DEBUG oslo_vmware.api [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Waiting for the task: (returnval){ [ 527.439227] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52a193f1-3990-8665-71ec-c47749cc9323" [ 527.439227] env[60044]: _type = "Task" [ 527.439227] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 527.447289] env[60044]: DEBUG oslo_vmware.api [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52a193f1-3990-8665-71ec-c47749cc9323, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 527.866417] env[60044]: DEBUG nova.network.neutron [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Updating instance_info_cache with network_info: [{"id": "e6b829be-7469-47df-86d8-aaa5c4008cbe", "address": "fa:16:3e:e8:9c:5f", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.52", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6b829be-74", "ovs_interfaceid": "e6b829be-7469-47df-86d8-aaa5c4008cbe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 527.889119] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Releasing lock "refresh_cache-23984fc7-95de-43c3-a21e-894fab241dce" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 527.889441] env[60044]: DEBUG nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Instance network_info: |[{"id": "e6b829be-7469-47df-86d8-aaa5c4008cbe", "address": 
"fa:16:3e:e8:9c:5f", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.52", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6b829be-74", "ovs_interfaceid": "e6b829be-7469-47df-86d8-aaa5c4008cbe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 527.891346] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e8:9c:5f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e6b829be-7469-47df-86d8-aaa5c4008cbe', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 527.902494] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Creating folder: Project (1d9184500ed74ad7bee0d5616a6dc843). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 527.903114] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-97917067-8bbf-46a8-8d85-e9677d19409d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 527.913275] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Created folder: Project (1d9184500ed74ad7bee0d5616a6dc843) in parent group-v449562. [ 527.913275] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Creating folder: Instances. Parent ref: group-v449578. 
{{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 527.913488] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4c23eb3c-8c10-4ba8-bc0b-e1a60d02c0a5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 527.922688] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Created folder: Instances in parent group-v449578. [ 527.922915] env[60044]: DEBUG oslo.service.loopingcall [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 527.923110] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 527.923307] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ebc8bd27-23e5-4fe6-80ee-297ea54fb111 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 527.958338] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 527.958338] env[60044]: value = "task-2204693" [ 527.958338] env[60044]: _type = "Task" [ 527.958338] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 527.969968] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204693, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 527.973809] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 527.974049] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 527.974256] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 527.974456] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 527.974752] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 527.974994] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c8175ebb-9995-4b86-9685-ecb83a9abff5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 527.984561] env[60044]: DEBUG oslo_vmware.api [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Waiting for the task: (returnval){ [ 527.984561] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52dc7b57-63cf-c30e-594c-cdb022582135" [ 527.984561] env[60044]: _type = "Task" [ 527.984561] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 527.993068] env[60044]: DEBUG oslo_vmware.api [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52dc7b57-63cf-c30e-594c-cdb022582135, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 528.474933] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204693, 'name': CreateVM_Task, 'duration_secs': 0.350422} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 528.475233] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 528.475989] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 528.494716] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 528.495048] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 528.495333] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 528.495722] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 528.496119] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 528.496347] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-817b71f6-148d-4833-bbc2-ec4829a078fb {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 528.501333] env[60044]: DEBUG oslo_vmware.api [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Waiting for the task: (returnval){ [ 528.501333] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]5279a6e3-2bb4-4216-b0e7-87f6fc1f41ea" [ 528.501333] env[60044]: _type = "Task" [ 528.501333] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 528.510681] env[60044]: DEBUG oslo_vmware.api [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]5279a6e3-2bb4-4216-b0e7-87f6fc1f41ea, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 528.954398] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "426f9016-4e69-4e46-87f6-a67f77da5dff" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 528.954672] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "426f9016-4e69-4e46-87f6-a67f77da5dff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 528.966383] env[60044]: DEBUG nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 529.016654] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 529.016898] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 529.017117] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 529.039599] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 529.039847] env[60044]: DEBUG 
oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 529.041382] env[60044]: INFO nova.compute.claims [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 529.184182] env[60044]: DEBUG nova.network.neutron [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Successfully updated port: 2969c86b-c4d7-431e-a7bf-a229e30feccb {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 529.198385] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "refresh_cache-ce718fc3-6f75-49b9-8543-c953646ce0d9" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 529.198704] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquired lock "refresh_cache-ce718fc3-6f75-49b9-8543-c953646ce0d9" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 529.199025] env[60044]: DEBUG nova.network.neutron [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 529.254407] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72d33dae-2f18-494e-bfbf-1420054689a1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.263195] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a0b63b5-3d59-4106-a537-8ccd82328bd4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.299082] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e59b407b-f6e3-424a-8ccb-c24d407b8876 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.307141] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2db864fa-49db-4cc4-9349-131800624cee {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.321593] env[60044]: DEBUG nova.compute.provider_tree [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Inventory has not changed in ProviderTree for provider: 
f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 529.332480] env[60044]: DEBUG nova.scheduler.client.report [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 529.349671] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 529.350180] env[60044]: DEBUG nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 529.385705] env[60044]: DEBUG nova.compute.utils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 529.388123] env[60044]: DEBUG nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 529.388354] env[60044]: DEBUG nova.network.neutron [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 529.398684] env[60044]: DEBUG nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 529.480786] env[60044]: DEBUG nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 529.508626] env[60044]: DEBUG nova.virt.hardware [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 529.508896] env[60044]: DEBUG nova.virt.hardware [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 529.509030] env[60044]: DEBUG nova.virt.hardware [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 529.509207] env[60044]: DEBUG nova.virt.hardware [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 529.509348] env[60044]: DEBUG nova.virt.hardware [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 529.509496] env[60044]: DEBUG nova.virt.hardware [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 529.509717] env[60044]: DEBUG nova.virt.hardware [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 529.509898] env[60044]: DEBUG nova.virt.hardware [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 529.510071] 
env[60044]: DEBUG nova.virt.hardware [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 529.510231] env[60044]: DEBUG nova.virt.hardware [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 529.510394] env[60044]: DEBUG nova.virt.hardware [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 529.511307] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f30a1e87-0a2a-4257-9a9e-30f2011ea40a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.521929] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8a5fe67-3ff4-4934-9b17-fbe50ca93ffe {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.568400] env[60044]: DEBUG nova.network.neutron [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Instance cache missing network info. 
{{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 529.576655] env[60044]: DEBUG nova.policy [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '09f4cd92a283451d9c10fe5f370ffa48', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3499ac5d4a9412e8e0d2db65c79c59c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 530.206064] env[60044]: DEBUG nova.network.neutron [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Updating instance_info_cache with network_info: [{"id": "2969c86b-c4d7-431e-a7bf-a229e30feccb", "address": "fa:16:3e:29:42:33", "network": {"id": "d8303e32-b5c8-45fb-a675-dcf0505feff5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-774580778-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb70c075cb2e4c44917d5ba6cb849786", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bd998416-f3d6-4a62-b828-5011063ce76a", "external-id": "nsx-vlan-transportzone-57", "segmentation_id": 57, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2969c86b-c4", "ovs_interfaceid": "2969c86b-c4d7-431e-a7bf-a229e30feccb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 530.222530] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Releasing lock "refresh_cache-ce718fc3-6f75-49b9-8543-c953646ce0d9" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 530.222868] env[60044]: DEBUG nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Instance network_info: |[{"id": "2969c86b-c4d7-431e-a7bf-a229e30feccb", "address": "fa:16:3e:29:42:33", "network": {"id": "d8303e32-b5c8-45fb-a675-dcf0505feff5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-774580778-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb70c075cb2e4c44917d5ba6cb849786", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bd998416-f3d6-4a62-b828-5011063ce76a", "external-id": "nsx-vlan-transportzone-57", "segmentation_id": 57, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2969c86b-c4", "ovs_interfaceid": "2969c86b-c4d7-431e-a7bf-a229e30feccb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 530.224021] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:29:42:33', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'bd998416-f3d6-4a62-b828-5011063ce76a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2969c86b-c4d7-431e-a7bf-a229e30feccb', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 530.232127] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Creating folder: Project (eb70c075cb2e4c44917d5ba6cb849786). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 530.232726] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a07e125a-0871-4863-b874-e3702832c9c9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 530.246590] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Created folder: Project (eb70c075cb2e4c44917d5ba6cb849786) in parent group-v449562. [ 530.246590] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Creating folder: Instances. Parent ref: group-v449581. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 530.246590] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a9491114-dc8a-4fa2-991e-3370bae00315 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 530.254669] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Created folder: Instances in parent group-v449581. [ 530.254893] env[60044]: DEBUG oslo.service.loopingcall [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 530.255088] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 530.255651] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-434848e2-8a9a-4502-ae19-6348579fbb70 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 530.277991] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 530.277991] env[60044]: value = "task-2204696" [ 530.277991] env[60044]: _type = "Task" [ 530.277991] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 530.293351] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204696, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 530.334429] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Received event network-vif-plugged-96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 530.334429] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Acquiring lock "ebc60b43-dc9e-4f3c-81c7-f65fe50be628-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 530.334574] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Lock "ebc60b43-dc9e-4f3c-81c7-f65fe50be628-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 530.334844] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Lock "ebc60b43-dc9e-4f3c-81c7-f65fe50be628-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 530.334844] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] No waiting events found dispatching network-vif-plugged-96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 530.335145] env[60044]: WARNING nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Received unexpected event network-vif-plugged-96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2 for instance with vm_state building and task_state spawning. 
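The entries above repeat one pattern: a vCenter task (CreateVM_Task, SearchDatastore_Task) is submitted, then polled ("progress is 0%" ... "completed successfully") before the build continues. Below is a minimal, illustrative Python sketch of that polling loop; it is not nova's or oslo.vmware's actual implementation, and `get_task_info` is an assumed callable standing in for the TaskInfo lookup against vCenter.

```python
import time


class TaskFailedError(Exception):
    """Raised when the polled vCenter task ends in the 'error' state."""


def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    """Poll a vCenter task (e.g. CreateVM_Task) until it finishes.

    get_task_info is an assumed callable returning an object with
    state, progress, result and error attributes, mirroring the
    TaskInfo the log entries above report on.
    """
    while True:
        info = get_task_info(task_ref)
        if info.state == 'success':
            return info.result  # e.g. the new VM's managed object reference
        if info.state == 'error':
            raise TaskFailedError(getattr(info.error, 'localizedMessage', info.error))
        # 'queued' / 'running': this is where the periodic "progress is N%"
        # lines in the log come from.
        print(f"Task {task_ref}: {info.state}, progress {info.progress or 0}%")
        time.sleep(poll_interval)
```

Polling on a fixed interval is what produces the regularly spaced progress entries for task-2204684, task-2204687, task-2204690, task-2204693 and task-2204696 above.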
[ 530.335145] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Received event network-vif-plugged-95b54886-0bbe-4351-8124-6f37519af668 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 530.337439] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Acquiring lock "4e62d785-7c74-4d3a-9446-e690822d5386-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 530.337439] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Lock "4e62d785-7c74-4d3a-9446-e690822d5386-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 530.337439] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Lock "4e62d785-7c74-4d3a-9446-e690822d5386-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 530.337439] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] No waiting events found dispatching network-vif-plugged-95b54886-0bbe-4351-8124-6f37519af668 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 530.337715] env[60044]: WARNING nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Received unexpected event network-vif-plugged-95b54886-0bbe-4351-8124-6f37519af668 for instance with vm_state building and task_state spawning. [ 530.337715] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Received event network-changed-e5151993-a08c-45a0-81c4-df2e7c6d3ad6 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 530.337715] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Refreshing instance network info cache due to event network-changed-e5151993-a08c-45a0-81c4-df2e7c6d3ad6. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 530.337715] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Acquiring lock "refresh_cache-db1dd823-8349-4f34-9a8e-ecec90bd105b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 530.337715] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Acquired lock "refresh_cache-db1dd823-8349-4f34-9a8e-ecec90bd105b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 530.337863] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Refreshing network info cache for port e5151993-a08c-45a0-81c4-df2e7c6d3ad6 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 530.798685] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204696, 'name': CreateVM_Task, 'duration_secs': 0.321967} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 530.798685] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 530.799282] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 530.799840] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 530.799840] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 530.800061] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7a0b169c-f6ab-445e-94e3-913fab478878 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 530.808762] env[60044]: DEBUG oslo_vmware.api [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Waiting for the task: (returnval){ [ 530.808762] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52ba8823-1909-fed6-3eb9-3764f9cfc631" [ 530.808762] env[60044]: _type = "Task" [ 530.808762] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 530.816747] env[60044]: DEBUG oslo_vmware.api [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52ba8823-1909-fed6-3eb9-3764f9cfc631, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 531.197122] env[60044]: DEBUG nova.network.neutron [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Successfully created port: 1c372ba3-b2a6-42b9-87fd-28de33e9b059 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 531.323353] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 531.323610] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 531.323810] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 531.814103] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Updated VIF entry in instance network info cache for port e5151993-a08c-45a0-81c4-df2e7c6d3ad6. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 531.814534] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Updating instance_info_cache with network_info: [{"id": "e5151993-a08c-45a0-81c4-df2e7c6d3ad6", "address": "fa:16:3e:a0:63:de", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape5151993-a0", "ovs_interfaceid": "e5151993-a08c-45a0-81c4-df2e7c6d3ad6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 531.830659] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Releasing lock "refresh_cache-db1dd823-8349-4f34-9a8e-ecec90bd105b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 531.830935] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Received event network-changed-96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 531.831113] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Refreshing instance network info cache due to event network-changed-96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 531.831318] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Acquiring lock "refresh_cache-ebc60b43-dc9e-4f3c-81c7-f65fe50be628" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 531.831454] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Acquired lock "refresh_cache-ebc60b43-dc9e-4f3c-81c7-f65fe50be628" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 531.831639] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Refreshing network info cache for port 96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 532.324218] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "ae25fbd0-3770-43fc-9850-cdb2065b5ce3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 532.324470] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "ae25fbd0-3770-43fc-9850-cdb2065b5ce3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 532.334509] env[60044]: DEBUG nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Starting instance... 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 532.398911] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 532.399526] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 532.401045] env[60044]: INFO nova.compute.claims [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 532.620212] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9127f30-6360-490b-a0e0-d0fdcb238736 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.632481] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efce797a-a49e-4e49-a45d-292726688266 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.670548] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-904aaa58-3121-41b1-8f30-e8acc95d7712 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.678229] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3accc37f-926f-4b46-942f-5672d68f1e66 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.695517] env[60044]: DEBUG nova.compute.provider_tree [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 532.714409] env[60044]: DEBUG nova.scheduler.client.report [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 532.735906] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 
tempest-ServersAdmin275Test-1044309481-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 532.736448] env[60044]: DEBUG nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 532.780037] env[60044]: DEBUG nova.compute.utils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 532.781990] env[60044]: DEBUG nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Not allocating networking since 'none' was specified. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 532.802702] env[60044]: DEBUG nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 532.806535] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Updated VIF entry in instance network info cache for port 96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 532.806843] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Updating instance_info_cache with network_info: [{"id": "96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2", "address": "fa:16:3e:44:d8:6c", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap96b0e278-5e", "ovs_interfaceid": "96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 532.816350] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Releasing lock "refresh_cache-ebc60b43-dc9e-4f3c-81c7-f65fe50be628" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 532.816643] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Received event network-changed-95b54886-0bbe-4351-8124-6f37519af668 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 532.816736] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Refreshing instance network info cache due to event network-changed-95b54886-0bbe-4351-8124-6f37519af668. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 532.816929] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Acquiring lock "refresh_cache-4e62d785-7c74-4d3a-9446-e690822d5386" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 532.817077] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Acquired lock "refresh_cache-4e62d785-7c74-4d3a-9446-e690822d5386" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 532.817231] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Refreshing network info cache for port 95b54886-0bbe-4351-8124-6f37519af668 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 532.895173] env[60044]: DEBUG nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Start spawning the instance on the hypervisor. {{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 532.925222] env[60044]: DEBUG nova.virt.hardware [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 532.926478] env[60044]: DEBUG nova.virt.hardware [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 532.926478] env[60044]: DEBUG nova.virt.hardware [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 532.926478] env[60044]: DEBUG nova.virt.hardware [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 532.926478] env[60044]: DEBUG nova.virt.hardware [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 
tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 532.928628] env[60044]: DEBUG nova.virt.hardware [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 532.928882] env[60044]: DEBUG nova.virt.hardware [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 532.930866] env[60044]: DEBUG nova.virt.hardware [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 532.931079] env[60044]: DEBUG nova.virt.hardware [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 532.931282] env[60044]: DEBUG nova.virt.hardware [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 532.931472] env[60044]: DEBUG nova.virt.hardware [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 532.934065] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-669ba5b3-ceee-4a0a-acbe-0c030a960544 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.946605] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b04922d-dbd8-438e-be57-444e9525bb5d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.962341] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Instance VIF info [] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 532.968732] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Creating folder: Project (d32baf4d3cbd4e7ba0286e667138fcf2). Parent ref: group-v449562. 
{{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 532.968924] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-35966f10-8ec3-41d7-b053-9ea992f23b76 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.980280] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Created folder: Project (d32baf4d3cbd4e7ba0286e667138fcf2) in parent group-v449562. [ 532.980467] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Creating folder: Instances. Parent ref: group-v449584. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 532.980758] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-53a30ca4-735b-4229-8f12-1654011a8ab2 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.989389] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Created folder: Instances in parent group-v449584. [ 532.989627] env[60044]: DEBUG oslo.service.loopingcall [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 532.989949] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 532.990160] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-338968d5-c8f9-4379-933f-8c031a189987 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 533.008880] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 533.008880] env[60044]: value = "task-2204699" [ 533.008880] env[60044]: _type = "Task" [ 533.008880] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 533.016700] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204699, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 533.520577] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204699, 'name': CreateVM_Task, 'duration_secs': 0.244681} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 533.521599] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 533.521599] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 533.521599] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 533.522077] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 533.522259] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d262998b-ad21-473d-8214-8a97d637a2d8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 533.528935] env[60044]: DEBUG oslo_vmware.api [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Waiting for the task: (returnval){ [ 533.528935] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52d025ac-94d6-fb1a-a116-6c7a051277bf" [ 533.528935] env[60044]: _type = "Task" [ 533.528935] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 533.540562] env[60044]: DEBUG oslo_vmware.api [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52d025ac-94d6-fb1a-a116-6c7a051277bf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 533.699322] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Updated VIF entry in instance network info cache for port 95b54886-0bbe-4351-8124-6f37519af668. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 533.700022] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Updating instance_info_cache with network_info: [{"id": "95b54886-0bbe-4351-8124-6f37519af668", "address": "fa:16:3e:b4:0e:da", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.109", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap95b54886-0b", "ovs_interfaceid": "95b54886-0bbe-4351-8124-6f37519af668", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 533.716844] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Releasing lock "refresh_cache-4e62d785-7c74-4d3a-9446-e690822d5386" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 533.716844] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Received event network-vif-plugged-e6b829be-7469-47df-86d8-aaa5c4008cbe {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 533.717066] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Acquiring lock "23984fc7-95de-43c3-a21e-894fab241dce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.717201] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Lock "23984fc7-95de-43c3-a21e-894fab241dce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.717356] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Lock "23984fc7-95de-43c3-a21e-894fab241dce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.717511] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e 
req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] No waiting events found dispatching network-vif-plugged-e6b829be-7469-47df-86d8-aaa5c4008cbe {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 533.717676] env[60044]: WARNING nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Received unexpected event network-vif-plugged-e6b829be-7469-47df-86d8-aaa5c4008cbe for instance with vm_state building and task_state spawning. [ 533.717834] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Received event network-changed-e6b829be-7469-47df-86d8-aaa5c4008cbe {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 533.717980] env[60044]: DEBUG nova.compute.manager [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Refreshing instance network info cache due to event network-changed-e6b829be-7469-47df-86d8-aaa5c4008cbe. {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 533.718165] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Acquiring lock "refresh_cache-23984fc7-95de-43c3-a21e-894fab241dce" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 533.718294] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Acquired lock "refresh_cache-23984fc7-95de-43c3-a21e-894fab241dce" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 533.718595] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Refreshing network info cache for port e6b829be-7469-47df-86d8-aaa5c4008cbe {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 533.765216] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "27836d31-f379-4b4b-aed1-155f4a947779" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.765447] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "27836d31-f379-4b4b-aed1-155f4a947779" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.768217] env[60044]: DEBUG nova.compute.manager [req-7fe1c5d0-c704-489b-85d5-47e197eb39b0 req-fa9c907b-1b69-41da-9835-d29417616c6f service nova] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Received event network-vif-plugged-2969c86b-c4d7-431e-a7bf-a229e30feccb 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 533.768430] env[60044]: DEBUG oslo_concurrency.lockutils [req-7fe1c5d0-c704-489b-85d5-47e197eb39b0 req-fa9c907b-1b69-41da-9835-d29417616c6f service nova] Acquiring lock "ce718fc3-6f75-49b9-8543-c953646ce0d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.768600] env[60044]: DEBUG oslo_concurrency.lockutils [req-7fe1c5d0-c704-489b-85d5-47e197eb39b0 req-fa9c907b-1b69-41da-9835-d29417616c6f service nova] Lock "ce718fc3-6f75-49b9-8543-c953646ce0d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.768756] env[60044]: DEBUG oslo_concurrency.lockutils [req-7fe1c5d0-c704-489b-85d5-47e197eb39b0 req-fa9c907b-1b69-41da-9835-d29417616c6f service nova] Lock "ce718fc3-6f75-49b9-8543-c953646ce0d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.768915] env[60044]: DEBUG nova.compute.manager [req-7fe1c5d0-c704-489b-85d5-47e197eb39b0 req-fa9c907b-1b69-41da-9835-d29417616c6f service nova] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] No waiting events found dispatching network-vif-plugged-2969c86b-c4d7-431e-a7bf-a229e30feccb {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 533.769086] env[60044]: WARNING nova.compute.manager [req-7fe1c5d0-c704-489b-85d5-47e197eb39b0 req-fa9c907b-1b69-41da-9835-d29417616c6f service nova] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Received unexpected event network-vif-plugged-2969c86b-c4d7-431e-a7bf-a229e30feccb for instance with vm_state building and task_state spawning. [ 533.782792] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Starting instance... 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 533.819219] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "ef011071-c0e1-44e0-9940-285f2f45da67" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.819219] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "ef011071-c0e1-44e0-9940-285f2f45da67" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.856746] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.856746] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.858209] env[60044]: INFO nova.compute.claims [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 534.053343] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 534.053343] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 534.053343] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 534.125941] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e310e7a4-fce6-4bb5-8f1c-2487c07b0eef {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 534.139496] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc3a4221-f961-4cba-b2e8-588f1fedac22 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 534.181799] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14436853-1bfa-4df9-b5b6-32aaa5bbcb67 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 534.188934] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22894666-e30a-4e53-8153-e51fef494959 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 534.206018] env[60044]: DEBUG nova.compute.provider_tree [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 534.215875] env[60044]: DEBUG nova.scheduler.client.report [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 534.236616] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.380s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 534.237129] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 534.280277] env[60044]: DEBUG nova.compute.utils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 534.285597] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Allocating IP information in the background. 
{{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 534.285597] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 534.298246] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 534.398218] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Start spawning the instance on the hypervisor. {{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 534.430016] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 534.430016] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 534.430016] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 534.430314] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 534.430314] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 534.430428] env[60044]: DEBUG nova.virt.hardware [None 
req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 534.430620] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 534.430780] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 534.430942] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 534.431505] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 534.431745] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 534.434077] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3be46cea-9054-46c0-8bcb-05c67b49a1a6 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 534.444120] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2da55e2b-55d5-4d24-b443-3f69288188bc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 534.461634] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Updated VIF entry in instance network info cache for port e6b829be-7469-47df-86d8-aaa5c4008cbe. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 534.463853] env[60044]: DEBUG nova.network.neutron [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Updating instance_info_cache with network_info: [{"id": "e6b829be-7469-47df-86d8-aaa5c4008cbe", "address": "fa:16:3e:e8:9c:5f", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.52", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6b829be-74", "ovs_interfaceid": "e6b829be-7469-47df-86d8-aaa5c4008cbe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 534.475963] env[60044]: DEBUG oslo_concurrency.lockutils [req-bd998970-c880-4e83-896c-1fe650ca860e req-7613d99a-0616-453f-a46d-4567d101693a service nova] Releasing lock "refresh_cache-23984fc7-95de-43c3-a21e-894fab241dce" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 534.537135] env[60044]: DEBUG nova.policy [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5714eed2e79f4c12ace82daf1985577f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c70189e619ac48ffaccbeb4f298abbe1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 534.868121] env[60044]: DEBUG nova.network.neutron [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Successfully updated port: 1c372ba3-b2a6-42b9-87fd-28de33e9b059 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 534.886568] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "refresh_cache-426f9016-4e69-4e46-87f6-a67f77da5dff" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 534.886707] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 
tempest-ServerExternalEventsTest-1949741007-project-member] Acquired lock "refresh_cache-426f9016-4e69-4e46-87f6-a67f77da5dff" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 534.886856] env[60044]: DEBUG nova.network.neutron [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 535.100717] env[60044]: DEBUG nova.network.neutron [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 535.415220] env[60044]: DEBUG nova.compute.manager [req-97d799a6-94b5-4fd2-a3d9-a4302a1560b8 req-2749fbe5-1c18-4941-b52f-2ae785a5e0e7 service nova] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Received event network-vif-plugged-1c372ba3-b2a6-42b9-87fd-28de33e9b059 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 535.415744] env[60044]: DEBUG oslo_concurrency.lockutils [req-97d799a6-94b5-4fd2-a3d9-a4302a1560b8 req-2749fbe5-1c18-4941-b52f-2ae785a5e0e7 service nova] Acquiring lock "426f9016-4e69-4e46-87f6-a67f77da5dff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 535.415918] env[60044]: DEBUG oslo_concurrency.lockutils [req-97d799a6-94b5-4fd2-a3d9-a4302a1560b8 req-2749fbe5-1c18-4941-b52f-2ae785a5e0e7 service nova] Lock "426f9016-4e69-4e46-87f6-a67f77da5dff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 535.418087] env[60044]: DEBUG oslo_concurrency.lockutils [req-97d799a6-94b5-4fd2-a3d9-a4302a1560b8 req-2749fbe5-1c18-4941-b52f-2ae785a5e0e7 service nova] Lock "426f9016-4e69-4e46-87f6-a67f77da5dff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 535.419786] env[60044]: DEBUG nova.compute.manager [req-97d799a6-94b5-4fd2-a3d9-a4302a1560b8 req-2749fbe5-1c18-4941-b52f-2ae785a5e0e7 service nova] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] No waiting events found dispatching network-vif-plugged-1c372ba3-b2a6-42b9-87fd-28de33e9b059 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 535.419786] env[60044]: WARNING nova.compute.manager [req-97d799a6-94b5-4fd2-a3d9-a4302a1560b8 req-2749fbe5-1c18-4941-b52f-2ae785a5e0e7 service nova] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Received unexpected event network-vif-plugged-1c372ba3-b2a6-42b9-87fd-28de33e9b059 for instance with vm_state building and task_state spawning. 
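The lockutils and InstanceEvents entries above trace the external-event handshake these tests exercise: the compute manager takes the per-instance "<uuid>-events" lock, tries to pop a waiter for network-vif-plugged-<port>, finds none registered, and therefore logs the "Received unexpected event ... for instance with vm_state building and task_state spawning" warning. The sketch below is a minimal, hypothetical re-creation of that bookkeeping pattern only, not Nova's actual nova.compute.manager.InstanceEvents code; the names SimpleInstanceEvents, prepare, pop and handle_external_event are invented for illustration, and threading.Event stands in for the eventlet primitives the real service uses.

# Illustrative sketch (assumed names, stdlib only) of the expected-event
# bookkeeping suggested by the log entries above.
import threading
from collections import defaultdict


class SimpleInstanceEvents:
    """Track events a spawn is waiting for, keyed by instance UUID."""

    def __init__(self):
        self._lock = threading.Lock()      # cf. the "<uuid>-events" lock in the log
        self._events = defaultdict(dict)   # uuid -> {event_name: threading.Event}

    def prepare(self, instance_uuid, event_name):
        """Register interest before triggering the action (e.g. port binding)."""
        waiter = threading.Event()
        with self._lock:
            self._events[instance_uuid][event_name] = waiter
        return waiter

    def pop(self, instance_uuid, event_name):
        """Remove and return the waiter for an event, or None if none was registered."""
        with self._lock:
            return self._events[instance_uuid].pop(event_name, None)


def handle_external_event(events, instance_uuid, event_name):
    """Hypothetical handler for a Neutron notification about one instance."""
    waiter = events.pop(instance_uuid, event_name)
    if waiter is None:
        # Mirrors the WARNING above: the notification arrived before (or
        # without) anyone registering a waiter for it.
        print(f"Received unexpected event {event_name} for instance {instance_uuid}")
    else:
        waiter.set()


if __name__ == "__main__":
    events = SimpleInstanceEvents()
    uuid = "426f9016-4e69-4e46-87f6-a67f77da5dff"
    name = "network-vif-plugged-1c372ba3-b2a6-42b9-87fd-28de33e9b059"
    # No waiter registered yet, so this takes the "unexpected event" path:
    handle_external_event(events, uuid, name)
    # With a waiter registered, the same notification releases the spawn:
    waiter = events.prepare(uuid, name)
    handle_external_event(events, uuid, name)
    assert waiter.is_set()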
[ 535.733247] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Successfully created port: 68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 535.996612] env[60044]: DEBUG nova.network.neutron [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Updating instance_info_cache with network_info: [{"id": "1c372ba3-b2a6-42b9-87fd-28de33e9b059", "address": "fa:16:3e:52:08:26", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c372ba3-b2", "ovs_interfaceid": "1c372ba3-b2a6-42b9-87fd-28de33e9b059", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 536.019020] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Releasing lock "refresh_cache-426f9016-4e69-4e46-87f6-a67f77da5dff" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 536.019020] env[60044]: DEBUG nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Instance network_info: |[{"id": "1c372ba3-b2a6-42b9-87fd-28de33e9b059", "address": "fa:16:3e:52:08:26", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c372ba3-b2", "ovs_interfaceid": 
"1c372ba3-b2a6-42b9-87fd-28de33e9b059", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 536.019390] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:52:08:26', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1c372ba3-b2a6-42b9-87fd-28de33e9b059', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 536.029364] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Creating folder: Project (d3499ac5d4a9412e8e0d2db65c79c59c). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 536.029364] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d2cefb21-422f-449b-88a6-a2a63579ee99 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 536.046100] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Created folder: Project (d3499ac5d4a9412e8e0d2db65c79c59c) in parent group-v449562. [ 536.046255] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Creating folder: Instances. Parent ref: group-v449590. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 536.046507] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7e066703-3f52-4dfb-9448-397c9b6fc778 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 536.063718] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Created folder: Instances in parent group-v449590. [ 536.063718] env[60044]: DEBUG oslo.service.loopingcall [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 536.063718] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 536.063718] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4aae5794-db76-4883-a2e1-1c34b7c2349e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 536.087426] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 536.087426] env[60044]: value = "task-2204706" [ 536.087426] env[60044]: _type = "Task" [ 536.087426] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 536.095354] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204706, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 536.603692] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204706, 'name': CreateVM_Task} progress is 99%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 536.624536] env[60044]: DEBUG nova.compute.manager [req-6de04261-5c3b-4bb5-8ec9-619072654122 req-cd9f6a95-7bb0-4207-b1d7-99b734e734f4 service nova] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Received event network-changed-1c372ba3-b2a6-42b9-87fd-28de33e9b059 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 536.624536] env[60044]: DEBUG nova.compute.manager [req-6de04261-5c3b-4bb5-8ec9-619072654122 req-cd9f6a95-7bb0-4207-b1d7-99b734e734f4 service nova] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Refreshing instance network info cache due to event network-changed-1c372ba3-b2a6-42b9-87fd-28de33e9b059. {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 536.624536] env[60044]: DEBUG oslo_concurrency.lockutils [req-6de04261-5c3b-4bb5-8ec9-619072654122 req-cd9f6a95-7bb0-4207-b1d7-99b734e734f4 service nova] Acquiring lock "refresh_cache-426f9016-4e69-4e46-87f6-a67f77da5dff" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 536.624536] env[60044]: DEBUG oslo_concurrency.lockutils [req-6de04261-5c3b-4bb5-8ec9-619072654122 req-cd9f6a95-7bb0-4207-b1d7-99b734e734f4 service nova] Acquired lock "refresh_cache-426f9016-4e69-4e46-87f6-a67f77da5dff" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 536.624536] env[60044]: DEBUG nova.network.neutron [req-6de04261-5c3b-4bb5-8ec9-619072654122 req-cd9f6a95-7bb0-4207-b1d7-99b734e734f4 service nova] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Refreshing network info cache for port 1c372ba3-b2a6-42b9-87fd-28de33e9b059 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 537.103229] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204706, 'name': CreateVM_Task} progress is 99%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 537.388987] env[60044]: DEBUG nova.network.neutron [req-6de04261-5c3b-4bb5-8ec9-619072654122 req-cd9f6a95-7bb0-4207-b1d7-99b734e734f4 service nova] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Updated VIF entry in instance network info cache for port 1c372ba3-b2a6-42b9-87fd-28de33e9b059. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 537.388987] env[60044]: DEBUG nova.network.neutron [req-6de04261-5c3b-4bb5-8ec9-619072654122 req-cd9f6a95-7bb0-4207-b1d7-99b734e734f4 service nova] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Updating instance_info_cache with network_info: [{"id": "1c372ba3-b2a6-42b9-87fd-28de33e9b059", "address": "fa:16:3e:52:08:26", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.61", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c372ba3-b2", "ovs_interfaceid": "1c372ba3-b2a6-42b9-87fd-28de33e9b059", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 537.404601] env[60044]: DEBUG oslo_concurrency.lockutils [req-6de04261-5c3b-4bb5-8ec9-619072654122 req-cd9f6a95-7bb0-4207-b1d7-99b734e734f4 service nova] Releasing lock "refresh_cache-426f9016-4e69-4e46-87f6-a67f77da5dff" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 537.607248] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204706, 'name': CreateVM_Task, 'duration_secs': 1.401666} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 537.607913] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 537.608776] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 537.609248] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 537.610698] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 537.611747] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d26c208b-d503-42b3-9756-a18531b8cd8c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.617082] env[60044]: DEBUG oslo_vmware.api [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Waiting for the task: (returnval){ [ 537.617082] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52d8dca3-c8e2-224b-c6d1-9b5c64f29b9b" [ 537.617082] env[60044]: _type = "Task" [ 537.617082] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 537.626670] env[60044]: DEBUG oslo_vmware.api [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52d8dca3-c8e2-224b-c6d1-9b5c64f29b9b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 537.787319] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Successfully updated port: 68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 537.799293] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "refresh_cache-27836d31-f379-4b4b-aed1-155f4a947779" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 537.799293] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquired lock "refresh_cache-27836d31-f379-4b4b-aed1-155f4a947779" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 537.799495] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 537.856873] env[60044]: DEBUG nova.compute.manager [req-09182bfa-e0a6-4ba7-bbac-59520071cc85 req-1de18cfe-409f-402f-b278-05ae8d30a1e4 service nova] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Received event network-changed-2969c86b-c4d7-431e-a7bf-a229e30feccb {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 537.857093] env[60044]: DEBUG nova.compute.manager [req-09182bfa-e0a6-4ba7-bbac-59520071cc85 req-1de18cfe-409f-402f-b278-05ae8d30a1e4 service nova] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Refreshing instance network info cache due to event network-changed-2969c86b-c4d7-431e-a7bf-a229e30feccb. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 537.857322] env[60044]: DEBUG oslo_concurrency.lockutils [req-09182bfa-e0a6-4ba7-bbac-59520071cc85 req-1de18cfe-409f-402f-b278-05ae8d30a1e4 service nova] Acquiring lock "refresh_cache-ce718fc3-6f75-49b9-8543-c953646ce0d9" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 537.857724] env[60044]: DEBUG oslo_concurrency.lockutils [req-09182bfa-e0a6-4ba7-bbac-59520071cc85 req-1de18cfe-409f-402f-b278-05ae8d30a1e4 service nova] Acquired lock "refresh_cache-ce718fc3-6f75-49b9-8543-c953646ce0d9" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 537.857922] env[60044]: DEBUG nova.network.neutron [req-09182bfa-e0a6-4ba7-bbac-59520071cc85 req-1de18cfe-409f-402f-b278-05ae8d30a1e4 service nova] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Refreshing network info cache for port 2969c86b-c4d7-431e-a7bf-a229e30feccb {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 537.863245] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 538.132225] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 538.132477] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 538.132674] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 538.288134] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Updating instance_info_cache with network_info: [{"id": "68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb", "address": "fa:16:3e:c1:e1:64", "network": {"id": "9ffd298b-063f-4470-bb45-b8912bf9ac1c", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1717634151-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": 
{"injected": false, "tenant_id": "c70189e619ac48ffaccbeb4f298abbe1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap68f4d2ce-a1", "ovs_interfaceid": "68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 538.301369] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Releasing lock "refresh_cache-27836d31-f379-4b4b-aed1-155f4a947779" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 538.301773] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Instance network_info: |[{"id": "68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb", "address": "fa:16:3e:c1:e1:64", "network": {"id": "9ffd298b-063f-4470-bb45-b8912bf9ac1c", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1717634151-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c70189e619ac48ffaccbeb4f298abbe1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap68f4d2ce-a1", "ovs_interfaceid": "68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 538.302098] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c1:e1:64', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '15922696-dc08-44ef-97be-0b09a9dfeae8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 538.315042] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Creating folder: Project (c70189e619ac48ffaccbeb4f298abbe1). 
Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 538.315042] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0d9b558b-9f96-4e8a-a78c-b86df207901e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.325355] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Created folder: Project (c70189e619ac48ffaccbeb4f298abbe1) in parent group-v449562. [ 538.325665] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Creating folder: Instances. Parent ref: group-v449593. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 538.325665] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0493dbcc-e71a-4275-b0c9-518fc93f4a8e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.339644] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Created folder: Instances in parent group-v449593. [ 538.340811] env[60044]: DEBUG oslo.service.loopingcall [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 538.340811] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 538.340811] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b0fbd550-4306-45a2-81d0-d5ba0eab4d93 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.364476] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 538.364476] env[60044]: value = "task-2204711" [ 538.364476] env[60044]: _type = "Task" [ 538.364476] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 538.375523] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204711, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 538.456289] env[60044]: DEBUG nova.network.neutron [req-09182bfa-e0a6-4ba7-bbac-59520071cc85 req-1de18cfe-409f-402f-b278-05ae8d30a1e4 service nova] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Updated VIF entry in instance network info cache for port 2969c86b-c4d7-431e-a7bf-a229e30feccb. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 538.456412] env[60044]: DEBUG nova.network.neutron [req-09182bfa-e0a6-4ba7-bbac-59520071cc85 req-1de18cfe-409f-402f-b278-05ae8d30a1e4 service nova] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Updating instance_info_cache with network_info: [{"id": "2969c86b-c4d7-431e-a7bf-a229e30feccb", "address": "fa:16:3e:29:42:33", "network": {"id": "d8303e32-b5c8-45fb-a675-dcf0505feff5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-774580778-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb70c075cb2e4c44917d5ba6cb849786", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bd998416-f3d6-4a62-b828-5011063ce76a", "external-id": "nsx-vlan-transportzone-57", "segmentation_id": 57, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2969c86b-c4", "ovs_interfaceid": "2969c86b-c4d7-431e-a7bf-a229e30feccb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 538.471195] env[60044]: DEBUG oslo_concurrency.lockutils [req-09182bfa-e0a6-4ba7-bbac-59520071cc85 req-1de18cfe-409f-402f-b278-05ae8d30a1e4 service nova] Releasing lock "refresh_cache-ce718fc3-6f75-49b9-8543-c953646ce0d9" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 538.879527] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204711, 'name': CreateVM_Task} progress is 99%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 539.375621] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204711, 'name': CreateVM_Task, 'duration_secs': 0.569705} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 539.376173] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 539.376937] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 539.377482] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 539.378298] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 539.378739] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6ca70d62-66d4-403b-91c5-094a08fba50f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 539.386960] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for the task: (returnval){ [ 539.386960] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52e6ebd3-30a5-8d70-4104-63bd3d513187" [ 539.386960] env[60044]: _type = "Task" [ 539.386960] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 539.398773] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52e6ebd3-30a5-8d70-4104-63bd3d513187, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 539.777165] env[60044]: DEBUG nova.compute.manager [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Received event network-vif-plugged-68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 539.777165] env[60044]: DEBUG oslo_concurrency.lockutils [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] Acquiring lock "27836d31-f379-4b4b-aed1-155f4a947779-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 539.777165] env[60044]: DEBUG oslo_concurrency.lockutils [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] Lock "27836d31-f379-4b4b-aed1-155f4a947779-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 539.777165] env[60044]: DEBUG oslo_concurrency.lockutils [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] Lock "27836d31-f379-4b4b-aed1-155f4a947779-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 539.777419] env[60044]: DEBUG nova.compute.manager [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] No waiting events found dispatching network-vif-plugged-68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 539.777419] env[60044]: WARNING nova.compute.manager [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Received unexpected event network-vif-plugged-68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb for instance with vm_state building and task_state spawning. [ 539.777419] env[60044]: DEBUG nova.compute.manager [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Received event network-changed-68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 539.777419] env[60044]: DEBUG nova.compute.manager [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Refreshing instance network info cache due to event network-changed-68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 539.777419] env[60044]: DEBUG oslo_concurrency.lockutils [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] Acquiring lock "refresh_cache-27836d31-f379-4b4b-aed1-155f4a947779" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 539.777559] env[60044]: DEBUG oslo_concurrency.lockutils [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] Acquired lock "refresh_cache-27836d31-f379-4b4b-aed1-155f4a947779" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 539.777559] env[60044]: DEBUG nova.network.neutron [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Refreshing network info cache for port 68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 539.899675] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 539.899675] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 539.899675] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 540.482439] env[60044]: DEBUG nova.network.neutron [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Updated VIF entry in instance network info cache for port 68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 540.483669] env[60044]: DEBUG nova.network.neutron [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Updating instance_info_cache with network_info: [{"id": "68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb", "address": "fa:16:3e:c1:e1:64", "network": {"id": "9ffd298b-063f-4470-bb45-b8912bf9ac1c", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1717634151-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c70189e619ac48ffaccbeb4f298abbe1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap68f4d2ce-a1", "ovs_interfaceid": "68f4d2ce-a1e8-42d3-9c2e-c947ca9e76fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 540.498580] env[60044]: DEBUG oslo_concurrency.lockutils [req-0ee941d2-8ed2-4e02-a1a8-d18f963f1210 req-b6e758bc-236b-4c02-8d0d-5eb445aa136f service nova] Releasing lock "refresh_cache-27836d31-f379-4b4b-aed1-155f4a947779" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 570.363948] env[60044]: WARNING oslo_vmware.rw_handles [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 570.363948] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 570.363948] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 570.363948] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 570.363948] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 570.363948] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 570.363948] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 570.363948] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 570.363948] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 570.363948] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 570.363948] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 570.363948] env[60044]: ERROR oslo_vmware.rw_handles [ 570.364642] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 
tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/2f6f7529-d35c-43f8-a466-46dcb52e759c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 570.365837] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 570.366119] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Copying Virtual Disk [datastore2] vmware_temp/2f6f7529-d35c-43f8-a466-46dcb52e759c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/2f6f7529-d35c-43f8-a466-46dcb52e759c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 570.366476] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0421dc5c-da88-419f-8a4a-64eaa29df425 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.374400] env[60044]: DEBUG oslo_vmware.api [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Waiting for the task: (returnval){ [ 570.374400] env[60044]: value = "task-2204723" [ 570.374400] env[60044]: _type = "Task" [ 570.374400] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 570.382167] env[60044]: DEBUG oslo_vmware.api [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Task: {'id': task-2204723, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 570.887775] env[60044]: DEBUG oslo_vmware.exceptions [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Fault InvalidArgument not matched. 
{{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 570.888682] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 570.895435] env[60044]: ERROR nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 570.895435] env[60044]: Faults: ['InvalidArgument'] [ 570.895435] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Traceback (most recent call last): [ 570.895435] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 570.895435] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] yield resources [ 570.895435] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 570.895435] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] self.driver.spawn(context, instance, image_meta, [ 570.895435] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 570.895435] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] self._vmops.spawn(context, instance, image_meta, injected_files, [ 570.895435] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 570.895435] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] self._fetch_image_if_missing(context, vi) [ 570.895435] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] image_cache(vi, tmp_image_ds_loc) [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] vm_util.copy_virtual_disk( [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] session._wait_for_task(vmdk_copy_task) [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 570.895877] 
env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] return self.wait_for_task(task_ref) [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] return evt.wait() [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] result = hub.switch() [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 570.895877] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] return self.greenlet.switch() [ 570.896374] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 570.896374] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] self.f(*self.args, **self.kw) [ 570.896374] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 570.896374] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] raise exceptions.translate_fault(task_info.error) [ 570.896374] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 570.896374] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Faults: ['InvalidArgument'] [ 570.896374] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] [ 570.896576] env[60044]: INFO nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Terminating instance [ 570.899998] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 570.899998] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 570.899998] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "refresh_cache-a93c0169-490e-4cd2-b890-5e1d8aecae59" {{(pid=60044) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 570.899998] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquired lock "refresh_cache-a93c0169-490e-4cd2-b890-5e1d8aecae59" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 570.900208] env[60044]: DEBUG nova.network.neutron [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 570.900968] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c0d805f0-42f7-4834-80c8-726d36f39ab1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.912074] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 570.912074] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 570.912074] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0ea78aa2-973f-4bd4-9225-c1f9e098bc8c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.925936] env[60044]: DEBUG oslo_vmware.api [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Waiting for the task: (returnval){ [ 570.925936] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]526cecbc-5424-8966-1c01-b3a6b6d9554a" [ 570.925936] env[60044]: _type = "Task" [ 570.925936] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 570.936173] env[60044]: DEBUG oslo_vmware.api [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]526cecbc-5424-8966-1c01-b3a6b6d9554a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 570.946163] env[60044]: DEBUG nova.network.neutron [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Instance cache missing network info. 
{{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 571.078225] env[60044]: DEBUG nova.network.neutron [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 571.088549] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Releasing lock "refresh_cache-a93c0169-490e-4cd2-b890-5e1d8aecae59" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 571.091158] env[60044]: DEBUG nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 571.091158] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 571.091158] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2574d1a1-6abe-4db8-aa1a-ee35e0f3f45e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.101570] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 571.103500] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-581ca23d-dd54-45c8-85aa-3fecbe0819ac {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.131330] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 571.131560] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 571.131745] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Deleting the datastore file [datastore2] a93c0169-490e-4cd2-b890-5e1d8aecae59 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 571.132008] env[60044]: DEBUG oslo_vmware.service [-] Invoking 
FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-463d523b-e962-491d-89bf-d4ee7e59a381 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.139548] env[60044]: DEBUG oslo_vmware.api [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Waiting for the task: (returnval){ [ 571.139548] env[60044]: value = "task-2204725" [ 571.139548] env[60044]: _type = "Task" [ 571.139548] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 571.147677] env[60044]: DEBUG oslo_vmware.api [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Task: {'id': task-2204725, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 571.438963] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 571.439814] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Creating directory with path [datastore2] vmware_temp/63cfc58b-388c-4006-950b-14ad9e187081/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 571.439814] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7ab26c01-3e09-4700-8533-26b72865370a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.452505] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Created directory with path [datastore2] vmware_temp/63cfc58b-388c-4006-950b-14ad9e187081/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 571.452633] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Fetch image to [datastore2] vmware_temp/63cfc58b-388c-4006-950b-14ad9e187081/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 571.452814] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/63cfc58b-388c-4006-950b-14ad9e187081/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 571.454034] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3f79ef82-2b08-4bc2-81d7-090da192591e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.464537] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da2285ee-9e18-4ede-a495-230c24dfd6c3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.477939] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48e1ba2c-ba2e-4ca3-98cc-359492036613 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.511862] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0bd8117-c824-46a6-a800-74c0c1b0f29c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.517801] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c9fce07e-65c4-40bd-bce1-836a262a22cb {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.605030] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 571.648118] env[60044]: DEBUG oslo_vmware.api [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Task: {'id': task-2204725, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.03251} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 571.649952] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 571.650153] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 571.650349] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 571.651270] env[60044]: INFO nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Took 0.56 seconds to destroy the instance on the hypervisor. 
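
The records above show the oslo.vmware task-polling pattern at work: DeleteDatastoreFile_Task (task-2204725) is reported at "progress is 0%" and then "completed successfully", while the earlier CopyVirtualDisk failure surfaces through the same wait_for_task path as a translated VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']). The following is only a minimal, self-contained sketch of that poll-and-translate loop; FakeTask, TaskInfo and VimFaultError are invented stand-ins for illustration and are not oslo.vmware classes.

    import time
    from dataclasses import dataclass, field


    class VimFaultError(Exception):
        """Stand-in for a translated VIM fault such as InvalidArgument."""
        def __init__(self, message, faults):
            super().__init__(message)
            self.faults = faults


    @dataclass
    class TaskInfo:
        state: str              # 'running', 'success' or 'error'
        progress: int = 0
        error: dict | None = None


    @dataclass
    class FakeTask:
        """Pretends to be a vCenter task that finishes after a few polls."""
        outcome: str = "success"
        _polls: int = field(default=0, init=False)

        def info(self) -> TaskInfo:
            self._polls += 1
            if self._polls < 3:
                return TaskInfo(state="running", progress=self._polls * 30)
            if self.outcome == "success":
                return TaskInfo(state="success", progress=100)
            return TaskInfo(
                state="error",
                error={"msg": "A specified parameter was not correct: fileType",
                       "faults": ["InvalidArgument"]})


    def wait_for_task(task: FakeTask, interval: float = 0.2) -> TaskInfo:
        """Poll the task until it completes, turning an error state into an exception."""
        while True:
            info = task.info()
            if info.state == "running":
                print(f"progress is {info.progress}%")
                time.sleep(interval)
                continue
            if info.state == "success":
                return info
            # Mirrors the translate_fault(...) step visible in the traceback above.
            raise VimFaultError(info.error["msg"], info.error["faults"])


    if __name__ == "__main__":
        wait_for_task(FakeTask())                     # polls, then completes successfully
        try:
            wait_for_task(FakeTask(outcome="error"))  # raises like the fileType fault
        except VimFaultError as exc:
            print(f"Faults: {exc.faults}")

When the fault propagates out of this loop, the compute manager behaves as logged above: the claim is aborted, the half-built VM is unregistered and its datastore files deleted, and the build is re-scheduled.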
[ 571.651270] env[60044]: DEBUG oslo.service.loopingcall [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 571.653510] env[60044]: DEBUG nova.compute.manager [-] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Skipping network deallocation for instance since networking was not requested. {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 571.656722] env[60044]: DEBUG nova.compute.claims [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 571.656722] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 571.656722] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 571.673710] env[60044]: DEBUG oslo_vmware.rw_handles [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/63cfc58b-388c-4006-950b-14ad9e187081/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 571.739244] env[60044]: DEBUG oslo_vmware.rw_handles [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 571.739411] env[60044]: DEBUG oslo_vmware.rw_handles [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/63cfc58b-388c-4006-950b-14ad9e187081/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 571.905272] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcf94de4-7702-42ab-a750-35aae5fb8b49 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.913962] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccf60cbc-d64e-4b66-89d4-f316585dc4a3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.946695] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ea8b12c-57d6-4c1d-84e0-cf9c09b7b550 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.954373] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d648a13b-4918-4737-ab58-70c084a79877 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.968409] env[60044]: DEBUG nova.compute.provider_tree [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 571.980741] env[60044]: DEBUG nova.scheduler.client.report [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 571.996262] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.340s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 571.996833] env[60044]: ERROR nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 571.996833] env[60044]: Faults: ['InvalidArgument'] [ 571.996833] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Traceback (most recent call last): [ 571.996833] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 571.996833] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] self.driver.spawn(context, instance, 
image_meta, [ 571.996833] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 571.996833] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] self._vmops.spawn(context, instance, image_meta, injected_files, [ 571.996833] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 571.996833] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] self._fetch_image_if_missing(context, vi) [ 571.996833] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 571.996833] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] image_cache(vi, tmp_image_ds_loc) [ 571.996833] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] vm_util.copy_virtual_disk( [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] session._wait_for_task(vmdk_copy_task) [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] return self.wait_for_task(task_ref) [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] return evt.wait() [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] result = hub.switch() [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] return self.greenlet.switch() [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 571.997273] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] self.f(*self.args, **self.kw) [ 571.997668] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 571.997668] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] raise 
exceptions.translate_fault(task_info.error) [ 571.997668] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 571.997668] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Faults: ['InvalidArgument'] [ 571.997668] env[60044]: ERROR nova.compute.manager [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] [ 571.997668] env[60044]: DEBUG nova.compute.utils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] VimFaultException {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 572.000651] env[60044]: DEBUG nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Build of instance a93c0169-490e-4cd2-b890-5e1d8aecae59 was re-scheduled: A specified parameter was not correct: fileType [ 572.000651] env[60044]: Faults: ['InvalidArgument'] {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 572.001049] env[60044]: DEBUG nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 572.001318] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "refresh_cache-a93c0169-490e-4cd2-b890-5e1d8aecae59" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 572.001468] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquired lock "refresh_cache-a93c0169-490e-4cd2-b890-5e1d8aecae59" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 572.001626] env[60044]: DEBUG nova.network.neutron [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 572.035055] env[60044]: DEBUG nova.network.neutron [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Instance cache missing network info. 
{{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 572.096032] env[60044]: DEBUG nova.network.neutron [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 572.107249] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Releasing lock "refresh_cache-a93c0169-490e-4cd2-b890-5e1d8aecae59" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 572.107478] env[60044]: DEBUG nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 572.107659] env[60044]: DEBUG nova.compute.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Skipping network deallocation for instance since networking was not requested. {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 572.214225] env[60044]: INFO nova.scheduler.client.report [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Deleted allocations for instance a93c0169-490e-4cd2-b890-5e1d8aecae59 [ 572.233977] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "a93c0169-490e-4cd2-b890-5e1d8aecae59" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 53.025s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.264071] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Starting instance... 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 572.316688] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.317113] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.318396] env[60044]: INFO nova.compute.claims [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 572.596652] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da0dac26-6484-4b5d-bc40-1801d3aad37f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.609144] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8dfd64f-98c0-463b-a7d0-45dbff1b8c98 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.643255] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac70c73d-8b1c-4885-ad05-c1606c909da1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.651157] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bccf3242-d15a-48cb-ba62-466c8441ac74 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.664989] env[60044]: DEBUG nova.compute.provider_tree [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 572.673609] env[60044]: DEBUG nova.scheduler.client.report [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 572.689570] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 
tempest-MultipleCreateTestJSON-359342352-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.373s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.690096] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 572.729593] env[60044]: DEBUG nova.compute.utils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 572.730904] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 572.733802] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 572.741898] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 572.833229] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 572.868254] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 572.868761] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 572.868761] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 572.869139] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 572.869139] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 572.872390] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 572.872390] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 572.872390] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 572.872390] env[60044]: DEBUG nova.virt.hardware [None 
req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 572.872390] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 572.872644] env[60044]: DEBUG nova.virt.hardware [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 572.872644] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ef03076-d9ad-44cb-9597-d1111e1ae5b9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.883407] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71cb5d7a-d28e-466f-8f18-2e0db8f2f7ef {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.900790] env[60044]: DEBUG nova.policy [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5714eed2e79f4c12ace82daf1985577f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c70189e619ac48ffaccbeb4f298abbe1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 573.384514] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Successfully created port: 2b8c0a14-4aa9-4112-8fa8-aaba318be2e8 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 574.007692] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Successfully updated port: 2b8c0a14-4aa9-4112-8fa8-aaba318be2e8 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 574.025068] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "refresh_cache-ef011071-c0e1-44e0-9940-285f2f45da67" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 574.026353] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 
tempest-MultipleCreateTestJSON-359342352-project-member] Acquired lock "refresh_cache-ef011071-c0e1-44e0-9940-285f2f45da67" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 574.026577] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 574.072240] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 574.266234] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Updating instance_info_cache with network_info: [{"id": "2b8c0a14-4aa9-4112-8fa8-aaba318be2e8", "address": "fa:16:3e:98:5c:6e", "network": {"id": "9ffd298b-063f-4470-bb45-b8912bf9ac1c", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1717634151-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c70189e619ac48ffaccbeb4f298abbe1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b8c0a14-4a", "ovs_interfaceid": "2b8c0a14-4aa9-4112-8fa8-aaba318be2e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 574.280694] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Releasing lock "refresh_cache-ef011071-c0e1-44e0-9940-285f2f45da67" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 574.280980] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Instance network_info: |[{"id": "2b8c0a14-4aa9-4112-8fa8-aaba318be2e8", "address": "fa:16:3e:98:5c:6e", "network": {"id": "9ffd298b-063f-4470-bb45-b8912bf9ac1c", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1717634151-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c70189e619ac48ffaccbeb4f298abbe1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b8c0a14-4a", "ovs_interfaceid": "2b8c0a14-4aa9-4112-8fa8-aaba318be2e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 574.281388] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:98:5c:6e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '15922696-dc08-44ef-97be-0b09a9dfeae8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2b8c0a14-4aa9-4112-8fa8-aaba318be2e8', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 574.288838] env[60044]: DEBUG oslo.service.loopingcall [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 574.289672] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 574.289893] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-67c2d2fd-ce50-4e82-9ede-7970e48833f7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.310883] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 574.310883] env[60044]: value = "task-2204727" [ 574.310883] env[60044]: _type = "Task" [ 574.310883] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 574.319754] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204727, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 574.828589] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204727, 'name': CreateVM_Task, 'duration_secs': 0.315551} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 574.828724] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 574.829401] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 574.829569] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 574.829910] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 574.830183] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d32b42b8-39a2-4022-ac75-9531d600d951 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.835338] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for the task: (returnval){ [ 574.835338] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]5264691f-4639-c935-97b3-3d50e806ad75" [ 574.835338] env[60044]: _type = "Task" [ 574.835338] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 574.844349] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]5264691f-4639-c935-97b3-3d50e806ad75, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 575.347293] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 575.347293] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 575.347293] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 575.526796] env[60044]: DEBUG nova.compute.manager [req-1cc9abf0-3041-4219-8fad-5ae9157da721 req-00d4168d-a56e-4041-9a66-0863ed2ae5c9 service nova] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Received event network-vif-plugged-2b8c0a14-4aa9-4112-8fa8-aaba318be2e8 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 575.527072] env[60044]: DEBUG oslo_concurrency.lockutils [req-1cc9abf0-3041-4219-8fad-5ae9157da721 req-00d4168d-a56e-4041-9a66-0863ed2ae5c9 service nova] Acquiring lock "ef011071-c0e1-44e0-9940-285f2f45da67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 575.527225] env[60044]: DEBUG oslo_concurrency.lockutils [req-1cc9abf0-3041-4219-8fad-5ae9157da721 req-00d4168d-a56e-4041-9a66-0863ed2ae5c9 service nova] Lock "ef011071-c0e1-44e0-9940-285f2f45da67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 575.527386] env[60044]: DEBUG oslo_concurrency.lockutils [req-1cc9abf0-3041-4219-8fad-5ae9157da721 req-00d4168d-a56e-4041-9a66-0863ed2ae5c9 service nova] Lock "ef011071-c0e1-44e0-9940-285f2f45da67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 575.527543] env[60044]: DEBUG nova.compute.manager [req-1cc9abf0-3041-4219-8fad-5ae9157da721 req-00d4168d-a56e-4041-9a66-0863ed2ae5c9 service nova] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] No waiting events found dispatching network-vif-plugged-2b8c0a14-4aa9-4112-8fa8-aaba318be2e8 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 575.527704] env[60044]: WARNING nova.compute.manager [req-1cc9abf0-3041-4219-8fad-5ae9157da721 req-00d4168d-a56e-4041-9a66-0863ed2ae5c9 service nova] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Received unexpected event network-vif-plugged-2b8c0a14-4aa9-4112-8fa8-aaba318be2e8 for instance with vm_state building and 
task_state spawning. [ 578.785681] env[60044]: DEBUG nova.compute.manager [req-59559231-b03f-44df-8da4-c4d080b2c57a req-d6ef6bad-41fd-480e-8852-37365036a6d9 service nova] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Received event network-changed-2b8c0a14-4aa9-4112-8fa8-aaba318be2e8 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 578.785956] env[60044]: DEBUG nova.compute.manager [req-59559231-b03f-44df-8da4-c4d080b2c57a req-d6ef6bad-41fd-480e-8852-37365036a6d9 service nova] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Refreshing instance network info cache due to event network-changed-2b8c0a14-4aa9-4112-8fa8-aaba318be2e8. {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 578.786196] env[60044]: DEBUG oslo_concurrency.lockutils [req-59559231-b03f-44df-8da4-c4d080b2c57a req-d6ef6bad-41fd-480e-8852-37365036a6d9 service nova] Acquiring lock "refresh_cache-ef011071-c0e1-44e0-9940-285f2f45da67" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 578.786259] env[60044]: DEBUG oslo_concurrency.lockutils [req-59559231-b03f-44df-8da4-c4d080b2c57a req-d6ef6bad-41fd-480e-8852-37365036a6d9 service nova] Acquired lock "refresh_cache-ef011071-c0e1-44e0-9940-285f2f45da67" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 578.786410] env[60044]: DEBUG nova.network.neutron [req-59559231-b03f-44df-8da4-c4d080b2c57a req-d6ef6bad-41fd-480e-8852-37365036a6d9 service nova] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Refreshing network info cache for port 2b8c0a14-4aa9-4112-8fa8-aaba318be2e8 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 579.600300] env[60044]: DEBUG nova.network.neutron [req-59559231-b03f-44df-8da4-c4d080b2c57a req-d6ef6bad-41fd-480e-8852-37365036a6d9 service nova] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Updated VIF entry in instance network info cache for port 2b8c0a14-4aa9-4112-8fa8-aaba318be2e8. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 579.602057] env[60044]: DEBUG nova.network.neutron [req-59559231-b03f-44df-8da4-c4d080b2c57a req-d6ef6bad-41fd-480e-8852-37365036a6d9 service nova] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Updating instance_info_cache with network_info: [{"id": "2b8c0a14-4aa9-4112-8fa8-aaba318be2e8", "address": "fa:16:3e:98:5c:6e", "network": {"id": "9ffd298b-063f-4470-bb45-b8912bf9ac1c", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1717634151-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c70189e619ac48ffaccbeb4f298abbe1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b8c0a14-4a", "ovs_interfaceid": "2b8c0a14-4aa9-4112-8fa8-aaba318be2e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 579.639854] env[60044]: DEBUG oslo_concurrency.lockutils [req-59559231-b03f-44df-8da4-c4d080b2c57a req-d6ef6bad-41fd-480e-8852-37365036a6d9 service nova] Releasing lock "refresh_cache-ef011071-c0e1-44e0-9940-285f2f45da67" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 585.542538] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.574840] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.575014] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.576183] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.595345] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 585.595565] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" 
acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 585.595730] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 585.596290] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 585.598463] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1419db82-1655-4b5f-be10-e305bfe66926 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.607235] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae7a3c89-aaf0-4868-b71a-521ce4d28b26 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.624628] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ebc23c6-223f-49a6-86ce-6595a3b0c6d5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.632162] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-664f2948-129c-4c23-a6fc-7abdffd8d3b5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.667804] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181277MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 585.667999] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 585.668225] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 585.750730] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.750893] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance db1dd823-8349-4f34-9a8e-ecec90bd105b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.751050] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ebc60b43-dc9e-4f3c-81c7-f65fe50be628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.751147] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 4e62d785-7c74-4d3a-9446-e690822d5386 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.751266] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 23984fc7-95de-43c3-a21e-894fab241dce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.751415] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ce718fc3-6f75-49b9-8543-c953646ce0d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.751553] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 426f9016-4e69-4e46-87f6-a67f77da5dff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.751653] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.751731] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 27836d31-f379-4b4b-aed1-155f4a947779 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.751847] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ef011071-c0e1-44e0-9940-285f2f45da67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.752037] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 585.752189] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 585.923061] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4101f165-da43-4e98-b5bc-4fe2c53f1508 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.931343] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c55bf667-f9c9-42ec-b2ef-ad2a5e9f00db {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.963566] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a34c7304-b6fb-40a4-b2c4-023c08054728 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.972290] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ca68f4e-3b61-4293-840e-2d2d4f0b8e53 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.988961] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 585.999341] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 586.013579] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 586.013887] env[60044]: DEBUG oslo_concurrency.lockutils [None 
req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.346s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.458431] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 586.458739] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 586.458922] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 586.487805] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 586.489684] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 586.489989] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 586.490251] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 586.492782] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 586.492782] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 586.492782] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 586.492782] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Skipping network cache update for instance because it is Building. 
{{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 586.492782] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 586.493139] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 586.493139] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 586.493139] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 586.493139] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 587.018498] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 587.018755] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 587.018896] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 587.019056] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 596.363769] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "6874067b-8e9b-4242-9a5f-6312f1484a00" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.364394] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "6874067b-8e9b-4242-9a5f-6312f1484a00" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.215868] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "f03f507b-364f-41b9-ad33-dcb56ab03317" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.216199] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "f03f507b-364f-41b9-ad33-dcb56ab03317" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.649275] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "ea4a243b-481f-421d-ba29-c88c828f754e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.649700] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "ea4a243b-481f-421d-ba29-c88c828f754e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.720888] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "df997589-61b6-4f68-9169-e6f9bee650c7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.721266] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "df997589-61b6-4f68-9169-e6f9bee650c7" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.505415] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Acquiring lock "b62dda0a-da1d-4109-a925-bb32d01da242" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.505641] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "b62dda0a-da1d-4109-a925-bb32d01da242" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.108938] env[60044]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Acquiring lock "6604de35-7683-4d5d-ac6f-13752ccb940c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.109255] env[60044]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "6604de35-7683-4d5d-ac6f-13752ccb940c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 609.527587] env[60044]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "189903f4-37c9-4331-bb23-245ed68ecaae" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 609.527587] env[60044]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "189903f4-37c9-4331-bb23-245ed68ecaae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.311157] env[60044]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Acquiring lock "75cc0c18-27d3-4074-897b-08812a11829c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.311528] env[60044]: DEBUG oslo_concurrency.lockutils 
[None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "75cc0c18-27d3-4074-897b-08812a11829c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.325835] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Acquiring lock "22aa54d4-80ec-4d56-9239-41810c469b9e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.325835] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "22aa54d4-80ec-4d56-9239-41810c469b9e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.757307] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "885fe65d-ee02-4ed7-8d59-109775086038" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.757559] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "885fe65d-ee02-4ed7-8d59-109775086038" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.956227] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "ec879414-4534-4d0e-a65e-65baff80b16e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.956488] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "ec879414-4534-4d0e-a65e-65baff80b16e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.986167] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "cde76b14-ee01-44c8-8004-39cdf91e9889" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.986445] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "cde76b14-ee01-44c8-8004-39cdf91e9889" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 612.430325] env[60044]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "ca732c56-b1d1-40bf-96b6-4b93bc5ff29d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 612.430542] env[60044]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "ca732c56-b1d1-40bf-96b6-4b93bc5ff29d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.670770] env[60044]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Acquiring lock "6f0c0004-7fd2-49bf-bb1e-48774c481497" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.671046] env[60044]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "6f0c0004-7fd2-49bf-bb1e-48774c481497" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.840330] env[60044]: WARNING oslo_vmware.rw_handles [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 618.840330] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 618.840330] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 618.840330] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 618.840330] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 618.840330] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 618.840330] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 618.840330] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = 
self._read_status() [ 618.840330] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 618.840330] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 618.840330] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 618.840330] env[60044]: ERROR oslo_vmware.rw_handles [ 618.844029] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/63cfc58b-388c-4006-950b-14ad9e187081/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 618.844029] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 618.844029] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Copying Virtual Disk [datastore2] vmware_temp/63cfc58b-388c-4006-950b-14ad9e187081/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/63cfc58b-388c-4006-950b-14ad9e187081/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 618.844029] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-48e286e0-4ceb-48d2-8729-e6efc68aa806 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.852790] env[60044]: DEBUG oslo_vmware.api [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Waiting for the task: (returnval){ [ 618.852790] env[60044]: value = "task-2204733" [ 618.852790] env[60044]: _type = "Task" [ 618.852790] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 618.861589] env[60044]: DEBUG oslo_vmware.api [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Task: {'id': task-2204733, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 619.366085] env[60044]: DEBUG oslo_vmware.exceptions [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Fault InvalidArgument not matched. 
{{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 619.366085] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 619.366085] env[60044]: ERROR nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 619.366085] env[60044]: Faults: ['InvalidArgument'] [ 619.366085] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Traceback (most recent call last): [ 619.366085] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 619.366085] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] yield resources [ 619.366085] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 619.366085] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] self.driver.spawn(context, instance, image_meta, [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] self._vmops.spawn(context, instance, image_meta, injected_files, [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] self._fetch_image_if_missing(context, vi) [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] image_cache(vi, tmp_image_ds_loc) [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] vm_util.copy_virtual_disk( [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] session._wait_for_task(vmdk_copy_task) [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] return self.wait_for_task(task_ref) [ 619.366474] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] return evt.wait() [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] result = hub.switch() [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] return self.greenlet.switch() [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] self.f(*self.args, **self.kw) [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] raise exceptions.translate_fault(task_info.error) [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Faults: ['InvalidArgument'] [ 619.366812] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] [ 619.366812] env[60044]: INFO nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Terminating instance [ 619.367125] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 619.367401] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 619.368129] env[60044]: DEBUG nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 619.368510] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 619.368813] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e9a1a7be-1750-4478-b407-e62a4945d843 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.375897] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c1a0306-738e-4326-a645-ecac8c69fae8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.381687] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 619.381884] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 619.384315] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ce6225e8-5b72-4c51-836a-536d2e7f6e11 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.386516] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 619.386734] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f321207d-d8ec-4c9e-9f8e-83efbf0f3352 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.390695] env[60044]: DEBUG oslo_vmware.api [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Waiting for the task: (returnval){ [ 619.390695] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]522e9545-7e5a-c440-2914-8663e4a27175" [ 619.390695] env[60044]: _type = "Task" [ 619.390695] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 619.405632] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 619.407024] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Creating directory with path [datastore2] vmware_temp/eac59d03-73bf-45fc-a462-25f4c8e8dc3d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 619.407024] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-355363fa-01d2-47b3-b717-0cba1ecba20f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.425228] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Created directory with path [datastore2] vmware_temp/eac59d03-73bf-45fc-a462-25f4c8e8dc3d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 619.425394] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Fetch image to [datastore2] vmware_temp/eac59d03-73bf-45fc-a462-25f4c8e8dc3d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 619.425560] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/eac59d03-73bf-45fc-a462-25f4c8e8dc3d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 619.426337] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-952c5928-78d5-44ae-a057-15e159eb405a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.432964] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9566431-d8e0-44e6-9171-759391bdbe1e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.444227] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b65b2dbe-d1d2-4be8-90e1-bf8c3f6fed31 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.477285] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-58c0dffe-76b7-4233-b24a-b0fedfdca7fe {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.479905] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 619.481851] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 619.481851] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Deleting the datastore file [datastore2] 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 619.481851] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1cb6e36b-4914-4f9f-b6c9-b95cc11efc5f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.485409] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-885750fe-dae2-4dd9-b0d6-bb99837186c9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.488312] env[60044]: DEBUG oslo_vmware.api [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Waiting for the task: (returnval){ [ 619.488312] env[60044]: value = "task-2204735" [ 619.488312] env[60044]: _type = "Task" [ 619.488312] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 619.496296] env[60044]: DEBUG oslo_vmware.api [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Task: {'id': task-2204735, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 619.507513] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 619.568614] env[60044]: DEBUG oslo_vmware.rw_handles [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/eac59d03-73bf-45fc-a462-25f4c8e8dc3d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 619.629994] env[60044]: DEBUG oslo_vmware.rw_handles [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 619.630206] env[60044]: DEBUG oslo_vmware.rw_handles [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/eac59d03-73bf-45fc-a462-25f4c8e8dc3d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 619.999831] env[60044]: DEBUG oslo_vmware.api [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Task: {'id': task-2204735, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065334} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 620.000136] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 620.000352] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 620.000559] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 620.000768] env[60044]: INFO nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Took 0.63 seconds to destroy the instance on the hypervisor. 
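The sequence above (CopyVirtualDisk_Task raising InvalidArgument, then UnregisterVM and DeleteDatastoreFile_Task during teardown, with "progress is 0%" polling and a final "completed successfully" record) follows the oslo.vmware pattern that recurs throughout this log: invoke a vSphere method that returns a Task reference, then block on wait_for_task(), which polls the task and re-raises vCenter faults as VimFaultException. The following is a minimal illustrative sketch of that pattern, not code from Nova; the host, credentials and the delete_datastore_file() helper are placeholders and are not taken from this log.

    from oslo_vmware import api, exceptions

    # Placeholder connection values; not taken from this log.
    session = api.VMwareAPISession(
        'vc.example.test', 'svc-user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    def delete_datastore_file(path, datacenter_ref):
        """Illustrative helper: delete a datastore file and wait for the task.

        Mirrors the DeleteDatastoreFile_Task / wait_for_task sequence seen in
        the entries above; 'path' is a datastore path string such as
        '[datastore2] some/dir', 'datacenter_ref' a Datacenter moref.
        """
        file_manager = session.vim.service_content.fileManager
        task_ref = session.invoke_api(
            session.vim, 'DeleteDatastoreFile_Task', file_manager,
            name=path, datacenter=datacenter_ref)
        try:
            # Polls the task until it completes; raises if it ends in error.
            session.wait_for_task(task_ref)
        except exceptions.VimFaultException as exc:
            # exc.fault_list carries fault names such as 'InvalidArgument',
            # matching the "Faults: ['InvalidArgument']" lines in this log.
            print('VIM fault(s):', exc.fault_list)
            raise
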
[ 620.003189] env[60044]: DEBUG nova.compute.claims [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 620.003387] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.003630] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.355458] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4daecc2-243d-4177-b4b6-2e218bc3b367 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.370021] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-563a0583-e4d2-482b-b2e7-f15c6f5dc74b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.400346] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69aaff72-5c9e-42fb-b7f8-b6f4e5a71e22 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.408080] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1852f42-868f-410c-98c7-b21e02c16847 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.423690] env[60044]: DEBUG nova.compute.provider_tree [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 620.434190] env[60044]: DEBUG nova.scheduler.client.report [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 620.448692] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.445s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.449270] env[60044]: ERROR nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 620.449270] env[60044]: Faults: ['InvalidArgument'] [ 620.449270] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Traceback (most recent call last): [ 620.449270] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 620.449270] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] self.driver.spawn(context, instance, image_meta, [ 620.449270] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 620.449270] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] self._vmops.spawn(context, instance, image_meta, injected_files, [ 620.449270] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 620.449270] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] self._fetch_image_if_missing(context, vi) [ 620.449270] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 620.449270] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] image_cache(vi, tmp_image_ds_loc) [ 620.449270] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] vm_util.copy_virtual_disk( [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] session._wait_for_task(vmdk_copy_task) [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] return self.wait_for_task(task_ref) [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] return evt.wait() [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 
620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] result = hub.switch() [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] return self.greenlet.switch() [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 620.449677] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] self.f(*self.args, **self.kw) [ 620.449998] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 620.449998] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] raise exceptions.translate_fault(task_info.error) [ 620.449998] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 620.449998] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Faults: ['InvalidArgument'] [ 620.449998] env[60044]: ERROR nova.compute.manager [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] [ 620.449998] env[60044]: DEBUG nova.compute.utils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] VimFaultException {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 620.451446] env[60044]: DEBUG nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Build of instance 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee was re-scheduled: A specified parameter was not correct: fileType [ 620.451446] env[60044]: Faults: ['InvalidArgument'] {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 620.451863] env[60044]: DEBUG nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 620.452190] env[60044]: DEBUG nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 620.452426] env[60044]: DEBUG nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 620.452600] env[60044]: DEBUG nova.network.neutron [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 620.953094] env[60044]: DEBUG nova.network.neutron [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 620.960861] env[60044]: INFO nova.compute.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Took 0.51 seconds to deallocate network for instance. [ 621.066311] env[60044]: INFO nova.scheduler.client.report [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Deleted allocations for instance 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee [ 621.089175] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 108.116s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 621.118365] env[60044]: DEBUG nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Starting instance... 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 621.176213] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.176213] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.177666] env[60044]: INFO nova.compute.claims [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 621.544430] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-644944c8-af88-4cc4-9441-87269796fe53 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.553275] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f35ed61a-9307-4185-a5b4-519ef2c4992c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.583991] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3460a557-c2f0-4fe4-85cf-a0dd94ca4d56 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.591903] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36fa7fbd-7827-4b9b-80e4-21223f0a1818 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.607226] env[60044]: DEBUG nova.compute.provider_tree [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 621.615533] env[60044]: DEBUG nova.scheduler.client.report [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 621.635245] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 
tempest-DeleteServersTestJSON-478355548-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.459s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 621.635747] env[60044]: DEBUG nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 621.675025] env[60044]: DEBUG nova.compute.utils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 621.677436] env[60044]: DEBUG nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 621.677613] env[60044]: DEBUG nova.network.neutron [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 621.692381] env[60044]: DEBUG nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 621.765384] env[60044]: DEBUG nova.policy [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a9e33dfaaf3a44f4b19c904d7f7d5be2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '750bcd4b13bb4da9937e127e5abc1201', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 621.795034] env[60044]: DEBUG nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 621.830618] env[60044]: DEBUG nova.virt.hardware [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 621.830953] env[60044]: DEBUG nova.virt.hardware [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 621.831131] env[60044]: DEBUG nova.virt.hardware [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 621.831399] env[60044]: DEBUG nova.virt.hardware [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 621.831580] env[60044]: DEBUG nova.virt.hardware [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 621.831726] env[60044]: DEBUG nova.virt.hardware [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 621.832100] env[60044]: DEBUG nova.virt.hardware [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 621.832334] env[60044]: DEBUG nova.virt.hardware [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 621.832558] env[60044]: DEBUG nova.virt.hardware [None 
req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 621.832765] env[60044]: DEBUG nova.virt.hardware [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 621.832984] env[60044]: DEBUG nova.virt.hardware [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 621.834237] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c84f346d-9586-4740-8892-c1bbdfa589ef {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.845730] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1709ebd8-6f41-49ae-9ba0-35ee2e72305e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.185430] env[60044]: DEBUG nova.network.neutron [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Successfully created port: c0bedb65-8124-42c4-bfdb-81d886ea053a {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 622.855745] env[60044]: DEBUG nova.network.neutron [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Successfully updated port: c0bedb65-8124-42c4-bfdb-81d886ea053a {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 622.869359] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "refresh_cache-6874067b-8e9b-4242-9a5f-6312f1484a00" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 622.869359] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquired lock "refresh_cache-6874067b-8e9b-4242-9a5f-6312f1484a00" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 622.869359] env[60044]: DEBUG nova.network.neutron [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 622.919881] env[60044]: DEBUG nova.network.neutron [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 
6874067b-8e9b-4242-9a5f-6312f1484a00] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 623.111872] env[60044]: DEBUG nova.compute.manager [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Received event network-vif-plugged-c0bedb65-8124-42c4-bfdb-81d886ea053a {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 623.111872] env[60044]: DEBUG oslo_concurrency.lockutils [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] Acquiring lock "6874067b-8e9b-4242-9a5f-6312f1484a00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.111872] env[60044]: DEBUG oslo_concurrency.lockutils [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] Lock "6874067b-8e9b-4242-9a5f-6312f1484a00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.111872] env[60044]: DEBUG oslo_concurrency.lockutils [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] Lock "6874067b-8e9b-4242-9a5f-6312f1484a00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.112778] env[60044]: DEBUG nova.compute.manager [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] No waiting events found dispatching network-vif-plugged-c0bedb65-8124-42c4-bfdb-81d886ea053a {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 623.112778] env[60044]: WARNING nova.compute.manager [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Received unexpected event network-vif-plugged-c0bedb65-8124-42c4-bfdb-81d886ea053a for instance with vm_state building and task_state spawning. [ 623.112778] env[60044]: DEBUG nova.compute.manager [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Received event network-changed-c0bedb65-8124-42c4-bfdb-81d886ea053a {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 623.112778] env[60044]: DEBUG nova.compute.manager [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Refreshing instance network info cache due to event network-changed-c0bedb65-8124-42c4-bfdb-81d886ea053a. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 623.114494] env[60044]: DEBUG oslo_concurrency.lockutils [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] Acquiring lock "refresh_cache-6874067b-8e9b-4242-9a5f-6312f1484a00" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 623.180231] env[60044]: DEBUG nova.network.neutron [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Updating instance_info_cache with network_info: [{"id": "c0bedb65-8124-42c4-bfdb-81d886ea053a", "address": "fa:16:3e:99:c8:68", "network": {"id": "13cf098f-c6d5-4d37-94f2-ebb536194130", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-563206987-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "750bcd4b13bb4da9937e127e5abc1201", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2a75bb6e-6331-4429-b1b9-c968cc22b9c9", "external-id": "nsx-vlan-transportzone-244", "segmentation_id": 244, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc0bedb65-81", "ovs_interfaceid": "c0bedb65-8124-42c4-bfdb-81d886ea053a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 623.198598] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Releasing lock "refresh_cache-6874067b-8e9b-4242-9a5f-6312f1484a00" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 623.199284] env[60044]: DEBUG nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Instance network_info: |[{"id": "c0bedb65-8124-42c4-bfdb-81d886ea053a", "address": "fa:16:3e:99:c8:68", "network": {"id": "13cf098f-c6d5-4d37-94f2-ebb536194130", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-563206987-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "750bcd4b13bb4da9937e127e5abc1201", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2a75bb6e-6331-4429-b1b9-c968cc22b9c9", "external-id": "nsx-vlan-transportzone-244", "segmentation_id": 244, "bound_drivers": {"0": "nsxv3"}}, "devname": 
"tapc0bedb65-81", "ovs_interfaceid": "c0bedb65-8124-42c4-bfdb-81d886ea053a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 623.200312] env[60044]: DEBUG oslo_concurrency.lockutils [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] Acquired lock "refresh_cache-6874067b-8e9b-4242-9a5f-6312f1484a00" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 623.200794] env[60044]: DEBUG nova.network.neutron [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Refreshing network info cache for port c0bedb65-8124-42c4-bfdb-81d886ea053a {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 623.201976] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:99:c8:68', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2a75bb6e-6331-4429-b1b9-c968cc22b9c9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c0bedb65-8124-42c4-bfdb-81d886ea053a', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 623.212048] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Creating folder: Project (750bcd4b13bb4da9937e127e5abc1201). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 623.213795] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-61dee8d0-7167-4056-8690-b8934cee3137 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.230244] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Created folder: Project (750bcd4b13bb4da9937e127e5abc1201) in parent group-v449562. [ 623.230434] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Creating folder: Instances. Parent ref: group-v449602. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 623.230659] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b5f0c0e8-363a-4b95-8411-f4b09289bcc3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.246489] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Created folder: Instances in parent group-v449602. 
[ 623.247234] env[60044]: DEBUG oslo.service.loopingcall [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 623.247234] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 623.247234] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fe93d791-81b9-4a07-83c3-bca29b517e36 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.270302] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 623.270302] env[60044]: value = "task-2204738" [ 623.270302] env[60044]: _type = "Task" [ 623.270302] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 623.280655] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204738, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 623.504224] env[60044]: DEBUG nova.network.neutron [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Updated VIF entry in instance network info cache for port c0bedb65-8124-42c4-bfdb-81d886ea053a. {{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 623.504606] env[60044]: DEBUG nova.network.neutron [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Updating instance_info_cache with network_info: [{"id": "c0bedb65-8124-42c4-bfdb-81d886ea053a", "address": "fa:16:3e:99:c8:68", "network": {"id": "13cf098f-c6d5-4d37-94f2-ebb536194130", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-563206987-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "750bcd4b13bb4da9937e127e5abc1201", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2a75bb6e-6331-4429-b1b9-c968cc22b9c9", "external-id": "nsx-vlan-transportzone-244", "segmentation_id": 244, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc0bedb65-81", "ovs_interfaceid": "c0bedb65-8124-42c4-bfdb-81d886ea053a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 623.518899] env[60044]: DEBUG oslo_concurrency.lockutils [req-fb745b9f-88f6-4ec2-93ad-2e2866fbd399 req-8a7ec5ad-a459-407f-a261-7bf99056d163 service nova] Releasing lock "refresh_cache-6874067b-8e9b-4242-9a5f-6312f1484a00" {{(pid=60044) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 623.780640] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204738, 'name': CreateVM_Task, 'duration_secs': 0.326182} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 623.780806] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 623.781498] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 623.781649] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 623.782008] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 623.782258] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7993b883-4c32-4795-863f-a5d2ddc7c45e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.786941] env[60044]: DEBUG oslo_vmware.api [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Waiting for the task: (returnval){ [ 623.786941] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52bc169a-ead7-e102-53b2-f4ba13f30002" [ 623.786941] env[60044]: _type = "Task" [ 623.786941] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 623.795124] env[60044]: DEBUG oslo_vmware.api [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52bc169a-ead7-e102-53b2-f4ba13f30002, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 624.297631] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 624.298036] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 624.298167] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.614618] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "0e99d8ab-6b62-4ea9-b7c9-06394fa93e09" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.614900] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "0e99d8ab-6b62-4ea9-b7c9-06394fa93e09" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 645.021558] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 645.031522] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 645.031736] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 645.031899] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 645.032068] env[60044]: DEBUG 
nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 645.034865] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a180dadd-c1ef-4315-b33a-7b4787860c37 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 645.043342] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5318ab72-43db-440a-8059-413aa7c80cc5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 645.057018] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cd609d4-5356-44f2-8949-c5a0838a0d0a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 645.063576] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47111116-d4e4-4648-b1cd-ef84590724d6 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 645.091434] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181199MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 645.091552] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 645.091647] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 645.158215] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance db1dd823-8349-4f34-9a8e-ecec90bd105b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 645.158970] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ebc60b43-dc9e-4f3c-81c7-f65fe50be628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 645.158970] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 4e62d785-7c74-4d3a-9446-e690822d5386 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 645.158970] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 23984fc7-95de-43c3-a21e-894fab241dce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 645.158970] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ce718fc3-6f75-49b9-8543-c953646ce0d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 645.159162] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 426f9016-4e69-4e46-87f6-a67f77da5dff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 645.159162] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 645.159162] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 27836d31-f379-4b4b-aed1-155f4a947779 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 645.159255] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ef011071-c0e1-44e0-9940-285f2f45da67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 645.159367] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 6874067b-8e9b-4242-9a5f-6312f1484a00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 645.184156] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance f03f507b-364f-41b9-ad33-dcb56ab03317 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.210578] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ea4a243b-481f-421d-ba29-c88c828f754e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.220091] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance df997589-61b6-4f68-9169-e6f9bee650c7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.229244] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance b62dda0a-da1d-4109-a925-bb32d01da242 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.238094] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 6604de35-7683-4d5d-ac6f-13752ccb940c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.246741] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 189903f4-37c9-4331-bb23-245ed68ecaae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.256096] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 75cc0c18-27d3-4074-897b-08812a11829c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.264777] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 22aa54d4-80ec-4d56-9239-41810c469b9e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.274266] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 885fe65d-ee02-4ed7-8d59-109775086038 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.282880] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ec879414-4534-4d0e-a65e-65baff80b16e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.291968] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance cde76b14-ee01-44c8-8004-39cdf91e9889 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.301883] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ca732c56-b1d1-40bf-96b6-4b93bc5ff29d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.310689] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 6f0c0004-7fd2-49bf-bb1e-48774c481497 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.319992] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 645.319992] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 645.319992] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 645.574551] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67ad9f1c-eb1e-4e71-88a2-a996cd378a86 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 645.581922] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42460c30-4f90-495a-b215-85c31b140198 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 645.611012] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7814f5e2-4487-4cbc-b183-7e4bf8fe0ffc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 645.617825] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99c2919a-f2d0-4b3d-a0f6-8c62ecb6d23d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 645.631038] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 645.638201] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 645.651121] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 645.651294] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.560s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.649690] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running 
periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 647.650025] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 647.650025] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 647.669961] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 647.670135] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 647.670273] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 647.670397] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 647.670519] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 647.670638] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 647.670756] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 647.670875] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 647.670990] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Skipping network cache update for instance because it is Building. 
{{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 647.671123] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 647.671241] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 647.671668] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 647.671840] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 647.671988] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 647.672180] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 648.019527] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 648.019598] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 648.019802] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 649.014784] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 665.390897] env[60044]: WARNING oslo_vmware.rw_handles [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 665.390897] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 665.390897] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 665.390897] env[60044]: ERROR oslo_vmware.rw_handles 
self._conn.getresponse() [ 665.390897] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 665.390897] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 665.390897] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 665.390897] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 665.390897] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 665.390897] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 665.390897] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 665.390897] env[60044]: ERROR oslo_vmware.rw_handles [ 665.391573] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/eac59d03-73bf-45fc-a462-25f4c8e8dc3d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 665.392966] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 665.393231] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Copying Virtual Disk [datastore2] vmware_temp/eac59d03-73bf-45fc-a462-25f4c8e8dc3d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/eac59d03-73bf-45fc-a462-25f4c8e8dc3d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 665.393553] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f3cda02a-119a-4cf2-9718-476a6489981c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.401102] env[60044]: DEBUG oslo_vmware.api [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Waiting for the task: (returnval){ [ 665.401102] env[60044]: value = "task-2204739" [ 665.401102] env[60044]: _type = "Task" [ 665.401102] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 665.409039] env[60044]: DEBUG oslo_vmware.api [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Task: {'id': task-2204739, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 665.910857] env[60044]: DEBUG oslo_vmware.exceptions [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Fault InvalidArgument not matched. {{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 665.911169] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 665.911729] env[60044]: ERROR nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 665.911729] env[60044]: Faults: ['InvalidArgument'] [ 665.911729] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Traceback (most recent call last): [ 665.911729] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 665.911729] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] yield resources [ 665.911729] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 665.911729] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] self.driver.spawn(context, instance, image_meta, [ 665.911729] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 665.911729] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 665.911729] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 665.911729] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] self._fetch_image_if_missing(context, vi) [ 665.911729] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] image_cache(vi, tmp_image_ds_loc) [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] vm_util.copy_virtual_disk( [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] session._wait_for_task(vmdk_copy_task) [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] return self.wait_for_task(task_ref) [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] return evt.wait() [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] result = hub.switch() [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 665.912102] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] return self.greenlet.switch() [ 665.912503] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 665.912503] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] self.f(*self.args, **self.kw) [ 665.912503] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 665.912503] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] raise exceptions.translate_fault(task_info.error) [ 665.912503] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 665.912503] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Faults: ['InvalidArgument'] [ 665.912503] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] [ 665.912503] env[60044]: INFO nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Terminating instance [ 665.913775] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 665.913974] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Creating directory with path [datastore2] 
devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 665.914825] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3d65ebb6-8d85-4735-98c4-3e85a1649657 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.916461] env[60044]: DEBUG nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 665.917819] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 665.917819] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82758430-d727-4b44-b65a-c58ca9dfc187 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.924268] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 665.924498] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f53ed4a3-30d3-49c8-a7c5-aaccfedc73d6 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.927667] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 665.927922] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 665.928887] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6f5bac1d-99f1-48b4-98dd-125fdffa457f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.933826] env[60044]: DEBUG oslo_vmware.api [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Waiting for the task: (returnval){ [ 665.933826] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]522ed0b7-d013-39e2-53ef-665421c32a58" [ 665.933826] env[60044]: _type = "Task" [ 665.933826] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 665.942927] env[60044]: DEBUG oslo_vmware.api [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]522ed0b7-d013-39e2-53ef-665421c32a58, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 666.000836] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 666.001095] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 666.001256] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Deleting the datastore file [datastore2] db1dd823-8349-4f34-9a8e-ecec90bd105b {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 666.001520] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8cc32681-9d54-4550-9f8e-69252d9bc99a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.008528] env[60044]: DEBUG oslo_vmware.api [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Waiting for the task: (returnval){ [ 666.008528] env[60044]: value = "task-2204741" [ 666.008528] env[60044]: _type = "Task" [ 666.008528] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 666.016646] env[60044]: DEBUG oslo_vmware.api [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Task: {'id': task-2204741, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 666.444389] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 666.444717] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Creating directory with path [datastore2] vmware_temp/7de7bb1a-d382-43b9-a166-8bccf1ab21e8/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 666.444879] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-237e1d95-cba8-4bcd-a7d3-88ad63e129a2 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.456285] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Created directory with path [datastore2] vmware_temp/7de7bb1a-d382-43b9-a166-8bccf1ab21e8/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 666.456285] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Fetch image to [datastore2] vmware_temp/7de7bb1a-d382-43b9-a166-8bccf1ab21e8/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 666.456443] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/7de7bb1a-d382-43b9-a166-8bccf1ab21e8/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 666.457213] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8797e4b-3755-4b5f-ac7b-47b98f6921ca {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.463898] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bb38cb0-d139-497c-8032-5e1e7cf03544 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.472713] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea5a0087-3ae4-4a4e-a289-cf10811c27e6 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.503922] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46d90cc7-f765-40e4-bc0d-f0b2b5f18c3d {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.512321] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e5beea36-d40a-4606-a746-7b0e13da0992 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.518446] env[60044]: DEBUG oslo_vmware.api [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Task: {'id': task-2204741, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063137} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 666.518671] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 666.518848] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 666.519025] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 666.519207] env[60044]: INFO nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 666.521215] env[60044]: DEBUG nova.compute.claims [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 666.521378] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 666.521603] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 666.534403] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 666.579146] env[60044]: DEBUG oslo_vmware.rw_handles [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7de7bb1a-d382-43b9-a166-8bccf1ab21e8/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 666.636399] env[60044]: DEBUG oslo_vmware.rw_handles [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 666.636582] env[60044]: DEBUG oslo_vmware.rw_handles [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7de7bb1a-d382-43b9-a166-8bccf1ab21e8/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 666.860075] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1371823-5ebd-4b75-928c-3020b200ae0c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.868308] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0d72c9c-7bda-4805-bcab-7e65e1ecf3b4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.898196] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b575df4f-52a9-4dee-95bd-155e557c0b5f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.905110] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e672dca-0067-4bc7-80b0-16dd7a19c98d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.918081] env[60044]: DEBUG nova.compute.provider_tree [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 666.926528] env[60044]: DEBUG nova.scheduler.client.report [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 666.939391] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.418s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 666.939944] env[60044]: ERROR nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 666.939944] env[60044]: Faults: ['InvalidArgument'] [ 666.939944] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Traceback (most recent call last): [ 666.939944] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 666.939944] env[60044]: ERROR nova.compute.manager 
[instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] self.driver.spawn(context, instance, image_meta, [ 666.939944] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 666.939944] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 666.939944] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 666.939944] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] self._fetch_image_if_missing(context, vi) [ 666.939944] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 666.939944] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] image_cache(vi, tmp_image_ds_loc) [ 666.939944] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] vm_util.copy_virtual_disk( [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] session._wait_for_task(vmdk_copy_task) [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] return self.wait_for_task(task_ref) [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] return evt.wait() [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] result = hub.switch() [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] return self.greenlet.switch() [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 666.940348] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] self.f(*self.args, **self.kw) [ 666.940716] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 666.940716] 
env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] raise exceptions.translate_fault(task_info.error) [ 666.940716] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 666.940716] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Faults: ['InvalidArgument'] [ 666.940716] env[60044]: ERROR nova.compute.manager [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] [ 666.940716] env[60044]: DEBUG nova.compute.utils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] VimFaultException {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 666.942182] env[60044]: DEBUG nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Build of instance db1dd823-8349-4f34-9a8e-ecec90bd105b was re-scheduled: A specified parameter was not correct: fileType [ 666.942182] env[60044]: Faults: ['InvalidArgument'] {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 666.942553] env[60044]: DEBUG nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 666.942836] env[60044]: DEBUG nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 666.942911] env[60044]: DEBUG nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 666.943084] env[60044]: DEBUG nova.network.neutron [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 667.209433] env[60044]: DEBUG nova.network.neutron [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 667.219914] env[60044]: INFO nova.compute.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Took 0.28 seconds to deallocate network for instance. [ 667.302683] env[60044]: INFO nova.scheduler.client.report [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Deleted allocations for instance db1dd823-8349-4f34-9a8e-ecec90bd105b [ 667.322650] env[60044]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "db1dd823-8349-4f34-9a8e-ecec90bd105b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 152.174s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.335931] env[60044]: DEBUG nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Starting instance... 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 667.383447] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.383550] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.385159] env[60044]: INFO nova.compute.claims [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 667.702256] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e143ae1e-bbe0-4184-9981-7c87d726b0eb {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.709602] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63a7cef1-7519-430f-ad32-4782c4049d78 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.738423] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6deaf56c-37c1-4078-91c9-4090ac1949c4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.745457] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fe1f1f8-b79c-4d31-a3bc-37fd49eb4d53 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.757973] env[60044]: DEBUG nova.compute.provider_tree [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 667.765935] env[60044]: DEBUG nova.scheduler.client.report [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 667.778514] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 
tempest-ServerGroupTestJSON-1664601793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.395s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.792972] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "da8c6e50-058d-4636-9b5b-e55cf7fe7946" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.793235] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "da8c6e50-058d-4636-9b5b-e55cf7fe7946" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.798295] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "da8c6e50-058d-4636-9b5b-e55cf7fe7946" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.005s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.798808] env[60044]: DEBUG nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 667.828784] env[60044]: DEBUG nova.compute.utils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 667.830052] env[60044]: DEBUG nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 667.833019] env[60044]: DEBUG nova.network.neutron [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 667.843099] env[60044]: DEBUG nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Start building block device mappings for instance. 
{{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 667.885197] env[60044]: DEBUG nova.policy [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '599f90008649481b950c0d7600639837', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e13da93f325a4e68ad89ac46dfcb196b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 667.906813] env[60044]: DEBUG nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Start spawning the instance on the hypervisor. {{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 667.928932] env[60044]: DEBUG nova.virt.hardware [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 667.929189] env[60044]: DEBUG nova.virt.hardware [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 667.929346] env[60044]: DEBUG nova.virt.hardware [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 667.929571] env[60044]: DEBUG nova.virt.hardware [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 667.929660] env[60044]: DEBUG nova.virt.hardware [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 667.929801] env[60044]: DEBUG nova.virt.hardware [None 
req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 667.930008] env[60044]: DEBUG nova.virt.hardware [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 667.930172] env[60044]: DEBUG nova.virt.hardware [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 667.930445] env[60044]: DEBUG nova.virt.hardware [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 667.930599] env[60044]: DEBUG nova.virt.hardware [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 667.930800] env[60044]: DEBUG nova.virt.hardware [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 667.931700] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f22d952-15f2-4094-b63d-37d5dc16302c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.939696] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24872ed6-9319-4a48-95d9-f942f5500a95 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.232284] env[60044]: DEBUG nova.network.neutron [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Successfully created port: ce27f8fc-b7cd-4215-9123-9f7bd3df4f42 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 668.867265] env[60044]: DEBUG nova.network.neutron [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Successfully updated port: ce27f8fc-b7cd-4215-9123-9f7bd3df4f42 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 668.877516] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock 
"refresh_cache-f03f507b-364f-41b9-ad33-dcb56ab03317" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 668.877516] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquired lock "refresh_cache-f03f507b-364f-41b9-ad33-dcb56ab03317" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 668.877516] env[60044]: DEBUG nova.network.neutron [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 668.925283] env[60044]: DEBUG nova.network.neutron [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 669.328204] env[60044]: DEBUG nova.network.neutron [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Updating instance_info_cache with network_info: [{"id": "ce27f8fc-b7cd-4215-9123-9f7bd3df4f42", "address": "fa:16:3e:85:0b:53", "network": {"id": "f8c3cd4b-dc45-4e1f-b7d0-965bb905f645", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-231414391-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e13da93f325a4e68ad89ac46dfcb196b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": "nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce27f8fc-b7", "ovs_interfaceid": "ce27f8fc-b7cd-4215-9123-9f7bd3df4f42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 669.346244] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Releasing lock "refresh_cache-f03f507b-364f-41b9-ad33-dcb56ab03317" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 669.346531] env[60044]: DEBUG nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Instance network_info: |[{"id": "ce27f8fc-b7cd-4215-9123-9f7bd3df4f42", "address": "fa:16:3e:85:0b:53", "network": {"id": 
"f8c3cd4b-dc45-4e1f-b7d0-965bb905f645", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-231414391-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e13da93f325a4e68ad89ac46dfcb196b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": "nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce27f8fc-b7", "ovs_interfaceid": "ce27f8fc-b7cd-4215-9123-9f7bd3df4f42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 669.346893] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:85:0b:53', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5f60c972-a72d-4c5f-a250-faadfd6eafbe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ce27f8fc-b7cd-4215-9123-9f7bd3df4f42', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 669.357827] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Creating folder: Project (e13da93f325a4e68ad89ac46dfcb196b). Parent ref: group-v449562. 
{{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 669.359337] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dbfc1460-8df1-4446-abd9-606142fce24b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.362634] env[60044]: DEBUG nova.compute.manager [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Received event network-vif-plugged-ce27f8fc-b7cd-4215-9123-9f7bd3df4f42 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 669.362874] env[60044]: DEBUG oslo_concurrency.lockutils [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] Acquiring lock "f03f507b-364f-41b9-ad33-dcb56ab03317-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 669.363148] env[60044]: DEBUG oslo_concurrency.lockutils [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] Lock "f03f507b-364f-41b9-ad33-dcb56ab03317-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 669.363261] env[60044]: DEBUG oslo_concurrency.lockutils [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] Lock "f03f507b-364f-41b9-ad33-dcb56ab03317-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 669.363448] env[60044]: DEBUG nova.compute.manager [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] No waiting events found dispatching network-vif-plugged-ce27f8fc-b7cd-4215-9123-9f7bd3df4f42 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 669.363662] env[60044]: WARNING nova.compute.manager [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Received unexpected event network-vif-plugged-ce27f8fc-b7cd-4215-9123-9f7bd3df4f42 for instance with vm_state building and task_state spawning. [ 669.363772] env[60044]: DEBUG nova.compute.manager [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Received event network-changed-ce27f8fc-b7cd-4215-9123-9f7bd3df4f42 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 669.363918] env[60044]: DEBUG nova.compute.manager [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Refreshing instance network info cache due to event network-changed-ce27f8fc-b7cd-4215-9123-9f7bd3df4f42. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 669.364108] env[60044]: DEBUG oslo_concurrency.lockutils [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] Acquiring lock "refresh_cache-f03f507b-364f-41b9-ad33-dcb56ab03317" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 669.364247] env[60044]: DEBUG oslo_concurrency.lockutils [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] Acquired lock "refresh_cache-f03f507b-364f-41b9-ad33-dcb56ab03317" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 669.364456] env[60044]: DEBUG nova.network.neutron [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Refreshing network info cache for port ce27f8fc-b7cd-4215-9123-9f7bd3df4f42 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 669.377582] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Created folder: Project (e13da93f325a4e68ad89ac46dfcb196b) in parent group-v449562. [ 669.377771] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Creating folder: Instances. Parent ref: group-v449605. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 669.378302] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7253e345-04af-4e9e-827d-808f4fa8d606 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.388769] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Created folder: Instances in parent group-v449605. [ 669.389022] env[60044]: DEBUG oslo.service.loopingcall [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 669.389212] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 669.389408] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8e047ba1-59d1-4e09-8f90-ca685a86f679 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.410165] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 669.410165] env[60044]: value = "task-2204744" [ 669.410165] env[60044]: _type = "Task" [ 669.410165] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 669.418650] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204744, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 669.826307] env[60044]: DEBUG nova.network.neutron [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Updated VIF entry in instance network info cache for port ce27f8fc-b7cd-4215-9123-9f7bd3df4f42. {{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 669.826830] env[60044]: DEBUG nova.network.neutron [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Updating instance_info_cache with network_info: [{"id": "ce27f8fc-b7cd-4215-9123-9f7bd3df4f42", "address": "fa:16:3e:85:0b:53", "network": {"id": "f8c3cd4b-dc45-4e1f-b7d0-965bb905f645", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-231414391-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e13da93f325a4e68ad89ac46dfcb196b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": "nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce27f8fc-b7", "ovs_interfaceid": "ce27f8fc-b7cd-4215-9123-9f7bd3df4f42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 669.838255] env[60044]: DEBUG oslo_concurrency.lockutils [req-a38596c4-0bee-4b99-abc9-5f858f94edb8 req-b4812f50-e35c-4fe4-adf8-8ef7e5de16e1 service nova] Releasing lock "refresh_cache-f03f507b-364f-41b9-ad33-dcb56ab03317" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 669.923168] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204744, 'name': CreateVM_Task, 'duration_secs': 0.342736} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 669.923576] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 669.924049] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 669.924219] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 669.924818] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 669.925084] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f982c6ca-1382-4086-8870-8d3e70beb094 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.930546] env[60044]: DEBUG oslo_vmware.api [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Waiting for the task: (returnval){ [ 669.930546] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52323ca6-ef2b-c663-90a5-a444fcc5ac8b" [ 669.930546] env[60044]: _type = "Task" [ 669.930546] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 669.944870] env[60044]: DEBUG oslo_vmware.api [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52323ca6-ef2b-c663-90a5-a444fcc5ac8b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 670.445623] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 670.445623] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 670.445623] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 706.014087] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 707.018469] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 707.018731] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 707.019515] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 707.029865] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.029865] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.029865] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.029865] env[60044]: DEBUG nova.compute.resource_tracker [None 
req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 707.030364] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87f21630-d98f-458f-b6b9-a119a20c0001 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.038807] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfa82758-89bd-4ec3-8d4d-60b6eb9049a1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.053375] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-567106e9-a016-40dd-bb26-51ff2fd2efab {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.059724] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fa0aff3-fb08-4632-8b74-1c3aa2a1a439 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.090975] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181261MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 707.091149] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.091335] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.156076] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ebc60b43-dc9e-4f3c-81c7-f65fe50be628 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 707.156261] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 4e62d785-7c74-4d3a-9446-e690822d5386 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 707.156394] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 23984fc7-95de-43c3-a21e-894fab241dce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 707.156516] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ce718fc3-6f75-49b9-8543-c953646ce0d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 707.156677] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 426f9016-4e69-4e46-87f6-a67f77da5dff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 707.156759] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 707.156865] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 27836d31-f379-4b4b-aed1-155f4a947779 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 707.157026] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ef011071-c0e1-44e0-9940-285f2f45da67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 707.157120] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 6874067b-8e9b-4242-9a5f-6312f1484a00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 707.157231] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance f03f507b-364f-41b9-ad33-dcb56ab03317 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 707.169631] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ea4a243b-481f-421d-ba29-c88c828f754e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.180481] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance df997589-61b6-4f68-9169-e6f9bee650c7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.190409] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance b62dda0a-da1d-4109-a925-bb32d01da242 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.201780] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 6604de35-7683-4d5d-ac6f-13752ccb940c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.212463] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 189903f4-37c9-4331-bb23-245ed68ecaae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.222642] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 75cc0c18-27d3-4074-897b-08812a11829c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.232112] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 22aa54d4-80ec-4d56-9239-41810c469b9e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.240697] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 885fe65d-ee02-4ed7-8d59-109775086038 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.249270] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ec879414-4534-4d0e-a65e-65baff80b16e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.258244] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance cde76b14-ee01-44c8-8004-39cdf91e9889 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.266862] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ca732c56-b1d1-40bf-96b6-4b93bc5ff29d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.275324] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 6f0c0004-7fd2-49bf-bb1e-48774c481497 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.284328] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 707.284544] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 707.284690] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 707.530683] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ab4f09b-966c-46a8-acde-ac6a414fc6bf {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.538484] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6139c4c5-bf30-4488-b2bd-760bd8c222a1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.568466] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2285d380-dac9-45e4-9122-10d798b46dc5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.575467] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-092fb6ea-678c-456e-bb7b-9af2cb74aba4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.588375] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.596541] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.610954] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 707.611232] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.520s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.611546] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running 
periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 708.611890] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 708.611890] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 708.632616] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 708.632863] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 708.633019] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 708.633157] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 708.633286] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 708.633410] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 708.633530] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 708.633651] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 708.633767] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Skipping network cache update for instance because it is Building. 
{{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 708.633887] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 708.634015] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 708.634488] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 708.634668] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 708.634818] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 708.634952] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 709.037786] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 710.019106] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 713.345692] env[60044]: WARNING oslo_vmware.rw_handles [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 713.345692] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 713.345692] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 713.345692] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 713.345692] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 713.345692] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 713.345692] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 713.345692] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 713.345692] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 713.345692] env[60044]: ERROR 
oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 713.345692] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 713.345692] env[60044]: ERROR oslo_vmware.rw_handles [ 713.346323] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/7de7bb1a-d382-43b9-a166-8bccf1ab21e8/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 713.347890] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 713.348186] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Copying Virtual Disk [datastore2] vmware_temp/7de7bb1a-d382-43b9-a166-8bccf1ab21e8/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/7de7bb1a-d382-43b9-a166-8bccf1ab21e8/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 713.348483] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f83dd0ad-9099-4d52-a460-79d08341bf2d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.355774] env[60044]: DEBUG oslo_vmware.api [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Waiting for the task: (returnval){ [ 713.355774] env[60044]: value = "task-2204745" [ 713.355774] env[60044]: _type = "Task" [ 713.355774] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 713.363959] env[60044]: DEBUG oslo_vmware.api [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Task: {'id': task-2204745, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 713.874955] env[60044]: DEBUG oslo_vmware.exceptions [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Fault InvalidArgument not matched. 
{{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 713.875219] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 713.875848] env[60044]: ERROR nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 713.875848] env[60044]: Faults: ['InvalidArgument'] [ 713.875848] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Traceback (most recent call last): [ 713.875848] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 713.875848] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] yield resources [ 713.875848] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 713.875848] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] self.driver.spawn(context, instance, image_meta, [ 713.875848] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 713.875848] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] self._vmops.spawn(context, instance, image_meta, injected_files, [ 713.875848] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 713.875848] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] self._fetch_image_if_missing(context, vi) [ 713.875848] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] image_cache(vi, tmp_image_ds_loc) [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] vm_util.copy_virtual_disk( [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] session._wait_for_task(vmdk_copy_task) [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 713.876561] 
env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] return self.wait_for_task(task_ref) [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] return evt.wait() [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] result = hub.switch() [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 713.876561] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] return self.greenlet.switch() [ 713.877157] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 713.877157] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] self.f(*self.args, **self.kw) [ 713.877157] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 713.877157] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] raise exceptions.translate_fault(task_info.error) [ 713.877157] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 713.877157] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Faults: ['InvalidArgument'] [ 713.877157] env[60044]: ERROR nova.compute.manager [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] [ 713.877157] env[60044]: INFO nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Terminating instance [ 713.880993] env[60044]: DEBUG nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 713.881478] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 713.881890] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 713.882434] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 713.883541] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93ec1213-0825-4981-be79-2a1bfb2d1231 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.888696] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d9be182a-05a5-4e18-aaaf-634f5d6644fc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.896868] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 713.897137] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9bf6a977-6704-498e-8309-9cb90cd49f75 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.899353] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 713.899549] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 713.900449] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c1a210c3-b84b-4ece-a2f0-fc2775c92bf2 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.905412] env[60044]: DEBUG oslo_vmware.api [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Waiting for the task: (returnval){ [ 713.905412] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52165086-ebd2-4175-52e5-054ec088d853" [ 713.905412] env[60044]: _type = "Task" [ 713.905412] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 713.912567] env[60044]: DEBUG oslo_vmware.api [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52165086-ebd2-4175-52e5-054ec088d853, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 713.963602] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 713.963810] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 713.963983] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Deleting the datastore file [datastore2] ebc60b43-dc9e-4f3c-81c7-f65fe50be628 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 713.964283] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bc8f9582-b5f5-4ef7-bb12-6895ddf2657f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.970389] env[60044]: DEBUG oslo_vmware.api [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Waiting for the task: (returnval){ [ 713.970389] env[60044]: value = "task-2204747" [ 713.970389] env[60044]: _type = "Task" [ 713.970389] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 713.977953] env[60044]: DEBUG oslo_vmware.api [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Task: {'id': task-2204747, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 714.176681] env[60044]: DEBUG nova.compute.manager [req-bb4b1300-7705-4c61-8c8b-c79586fd21d8 req-16625e35-a09a-4c4d-a45b-baca70146063 service nova] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Received event network-vif-deleted-96b0e278-5e0e-49e0-b8b5-d0ad705f8ea2 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 714.415256] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 714.415602] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Creating directory with path [datastore2] vmware_temp/a01f8ecb-37b6-46bd-ba2a-a77ea917ed73/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 714.415673] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e58b3f13-ae3a-4e37-8f0f-2118e89dc7bc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.427439] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Created directory with path [datastore2] vmware_temp/a01f8ecb-37b6-46bd-ba2a-a77ea917ed73/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 714.427631] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Fetch image to [datastore2] vmware_temp/a01f8ecb-37b6-46bd-ba2a-a77ea917ed73/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 714.427814] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/a01f8ecb-37b6-46bd-ba2a-a77ea917ed73/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 714.428558] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78dcae7f-c7c2-4a55-a2c8-7f4759399bdf {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.436265] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37f0d2f9-a112-4c28-9757-2518194c9c50 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.444742] env[60044]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ee1d1c1-643f-42e5-bd0f-0dc541f50119 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.478074] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b2fb533-a37e-41e4-8f3b-faed5d1d18bd {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.483990] env[60044]: DEBUG oslo_vmware.api [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Task: {'id': task-2204747, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077735} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 714.485351] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 714.485542] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 714.485710] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 714.485879] env[60044]: INFO nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 714.487579] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-33bea70e-b6c6-4b79-ae6e-a869ad0fb830 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.490115] env[60044]: DEBUG nova.compute.claims [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 714.490301] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.490511] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.509297] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 714.518696] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.519419] env[60044]: DEBUG nova.compute.utils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Instance ebc60b43-dc9e-4f3c-81c7-f65fe50be628 could not be found. {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 714.526214] env[60044]: DEBUG nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Instance disappeared during build. 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 714.526214] env[60044]: DEBUG nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 714.526214] env[60044]: DEBUG nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 714.526214] env[60044]: DEBUG nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 714.526374] env[60044]: DEBUG nova.network.neutron [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 714.556059] env[60044]: DEBUG nova.network.neutron [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 714.565705] env[60044]: INFO nova.compute.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Took 0.04 seconds to deallocate network for instance. [ 714.572266] env[60044]: DEBUG oslo_vmware.rw_handles [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a01f8ecb-37b6-46bd-ba2a-a77ea917ed73/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 714.639044] env[60044]: DEBUG oslo_vmware.rw_handles [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Completed reading data from the image iterator. 
{{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 714.639044] env[60044]: DEBUG oslo_vmware.rw_handles [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a01f8ecb-37b6-46bd-ba2a-a77ea917ed73/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 714.665338] env[60044]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "ebc60b43-dc9e-4f3c-81c7-f65fe50be628" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.894s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.681535] env[60044]: DEBUG nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 714.733753] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.733753] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.734978] env[60044]: INFO nova.compute.claims [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 715.060349] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3c0a2ec-a0ff-487d-9296-ac75cc364096 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.068330] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-526ad548-5f1f-41b8-a9d1-40f3182f9575 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.098012] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e09cff94-57e6-43b3-9745-1001f79d5174 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.105116] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0331e5f8-a9dc-4c48-8b17-10d82d5c4e0a {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.117826] env[60044]: DEBUG nova.compute.provider_tree [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 715.125961] env[60044]: DEBUG nova.scheduler.client.report [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 715.139020] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.405s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.139518] env[60044]: DEBUG nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 715.175140] env[60044]: DEBUG nova.compute.utils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 715.175140] env[60044]: DEBUG nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 715.175594] env[60044]: DEBUG nova.network.neutron [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 715.183933] env[60044]: DEBUG nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Start building block device mappings for instance. 
{{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 715.236476] env[60044]: DEBUG nova.policy [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '76461c030e8e4c168de2b2924851a433', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '75310bd38faf4daea1ed2e141769a330', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 715.256780] env[60044]: DEBUG nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Start spawning the instance on the hypervisor. {{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 715.278128] env[60044]: DEBUG nova.virt.hardware [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 715.278432] env[60044]: DEBUG nova.virt.hardware [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 715.278613] env[60044]: DEBUG nova.virt.hardware [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 715.278851] env[60044]: DEBUG nova.virt.hardware [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 715.279042] env[60044]: DEBUG nova.virt.hardware [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 715.279237] env[60044]: DEBUG nova.virt.hardware [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 
tempest-ServersTestJSON-1853208137-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 715.279464] env[60044]: DEBUG nova.virt.hardware [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 715.279670] env[60044]: DEBUG nova.virt.hardware [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 715.279868] env[60044]: DEBUG nova.virt.hardware [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 715.280074] env[60044]: DEBUG nova.virt.hardware [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 715.280259] env[60044]: DEBUG nova.virt.hardware [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 715.281142] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dee51711-2c2f-4c76-aa47-1dd42ebb0a1b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.289546] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b9ae596-42a1-4b24-9cf6-444c475e0ef3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.517074] env[60044]: DEBUG nova.network.neutron [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Successfully created port: 25593bac-ef49-4577-9496-d7f10b170036 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 715.749585] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "e84f3fe9-d377-4018-8874-972d1f888208" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.749807] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e84f3fe9-d377-4018-8874-972d1f888208" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.773944] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "0d87148b-1493-4777-a8b3-b94a64e8eca6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.774309] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "0d87148b-1493-4777-a8b3-b94a64e8eca6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.002282] env[60044]: DEBUG nova.compute.manager [req-35b0cc2d-b75d-4e5f-92ed-2275fe9b6b24 req-cefcb787-afb1-4f38-9369-a21674ac4ffb service nova] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Received event network-vif-plugged-25593bac-ef49-4577-9496-d7f10b170036 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 716.002505] env[60044]: DEBUG oslo_concurrency.lockutils [req-35b0cc2d-b75d-4e5f-92ed-2275fe9b6b24 req-cefcb787-afb1-4f38-9369-a21674ac4ffb service nova] Acquiring lock "ea4a243b-481f-421d-ba29-c88c828f754e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.002706] env[60044]: DEBUG oslo_concurrency.lockutils [req-35b0cc2d-b75d-4e5f-92ed-2275fe9b6b24 req-cefcb787-afb1-4f38-9369-a21674ac4ffb service nova] Lock "ea4a243b-481f-421d-ba29-c88c828f754e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.003034] env[60044]: DEBUG oslo_concurrency.lockutils [req-35b0cc2d-b75d-4e5f-92ed-2275fe9b6b24 req-cefcb787-afb1-4f38-9369-a21674ac4ffb service nova] Lock "ea4a243b-481f-421d-ba29-c88c828f754e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.003295] env[60044]: DEBUG nova.compute.manager [req-35b0cc2d-b75d-4e5f-92ed-2275fe9b6b24 req-cefcb787-afb1-4f38-9369-a21674ac4ffb service nova] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] No waiting events found dispatching network-vif-plugged-25593bac-ef49-4577-9496-d7f10b170036 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 716.003483] env[60044]: WARNING nova.compute.manager [req-35b0cc2d-b75d-4e5f-92ed-2275fe9b6b24 req-cefcb787-afb1-4f38-9369-a21674ac4ffb service nova] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Received unexpected event network-vif-plugged-25593bac-ef49-4577-9496-d7f10b170036 for instance with vm_state building and task_state spawning. 
[ 716.075249] env[60044]: DEBUG nova.network.neutron [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Successfully updated port: 25593bac-ef49-4577-9496-d7f10b170036 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 716.091320] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "refresh_cache-ea4a243b-481f-421d-ba29-c88c828f754e" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 716.091489] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquired lock "refresh_cache-ea4a243b-481f-421d-ba29-c88c828f754e" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 716.091658] env[60044]: DEBUG nova.network.neutron [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 716.129728] env[60044]: DEBUG nova.network.neutron [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 716.243638] env[60044]: DEBUG oslo_concurrency.lockutils [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "23984fc7-95de-43c3-a21e-894fab241dce" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.275613] env[60044]: DEBUG nova.compute.manager [req-a5753312-e494-49de-8276-8b48be7f93c1 req-ddf9d532-17e2-417e-93b0-40a3ed7847ff service nova] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Received event network-vif-deleted-95b54886-0bbe-4351-8124-6f37519af668 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 716.292578] env[60044]: DEBUG nova.network.neutron [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Updating instance_info_cache with network_info: [{"id": "25593bac-ef49-4577-9496-d7f10b170036", "address": "fa:16:3e:09:41:3b", "network": {"id": "a79ad833-5bf2-4736-b86c-3b1a3ce5d862", "bridge": "br-int", "label": "tempest-ServersTestJSON-485546304-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "75310bd38faf4daea1ed2e141769a330", "mtu": 8950, "physical_network": 
"default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5116f690-f825-4fee-8a47-42b073e716c5", "external-id": "nsx-vlan-transportzone-692", "segmentation_id": 692, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25593bac-ef", "ovs_interfaceid": "25593bac-ef49-4577-9496-d7f10b170036", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.306850] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Releasing lock "refresh_cache-ea4a243b-481f-421d-ba29-c88c828f754e" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 716.307142] env[60044]: DEBUG nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Instance network_info: |[{"id": "25593bac-ef49-4577-9496-d7f10b170036", "address": "fa:16:3e:09:41:3b", "network": {"id": "a79ad833-5bf2-4736-b86c-3b1a3ce5d862", "bridge": "br-int", "label": "tempest-ServersTestJSON-485546304-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "75310bd38faf4daea1ed2e141769a330", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5116f690-f825-4fee-8a47-42b073e716c5", "external-id": "nsx-vlan-transportzone-692", "segmentation_id": 692, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25593bac-ef", "ovs_interfaceid": "25593bac-ef49-4577-9496-d7f10b170036", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 716.307694] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:09:41:3b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5116f690-f825-4fee-8a47-42b073e716c5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '25593bac-ef49-4577-9496-d7f10b170036', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 716.315644] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Creating folder: Project (75310bd38faf4daea1ed2e141769a330). Parent ref: group-v449562. 
{{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 716.316101] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a9febdb5-5b14-4cd8-ac68-389d3a594686 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.326744] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Created folder: Project (75310bd38faf4daea1ed2e141769a330) in parent group-v449562. [ 716.326902] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Creating folder: Instances. Parent ref: group-v449608. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 716.327127] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8f8817b7-b992-4822-8aca-1b97e2254631 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.334585] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Created folder: Instances in parent group-v449608. [ 716.334832] env[60044]: DEBUG oslo.service.loopingcall [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 716.334992] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 716.335198] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a5e97351-2c43-4747-bebc-083c2f933e66 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.352934] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 716.352934] env[60044]: value = "task-2204750" [ 716.352934] env[60044]: _type = "Task" [ 716.352934] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 716.360187] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204750, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 716.862436] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204750, 'name': CreateVM_Task, 'duration_secs': 0.302881} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 716.862692] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 716.863392] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 716.863557] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 716.863875] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 716.864127] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e6af5be0-b588-4578-9d6e-895e6f355c98 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.868672] env[60044]: DEBUG oslo_vmware.api [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Waiting for the task: (returnval){ [ 716.868672] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]521ac743-78d8-d515-63c3-94f14f3efe9c" [ 716.868672] env[60044]: _type = "Task" [ 716.868672] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 716.877514] env[60044]: DEBUG oslo_vmware.api [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]521ac743-78d8-d515-63c3-94f14f3efe9c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 717.379033] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 717.379033] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 717.379033] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 718.043957] env[60044]: DEBUG nova.compute.manager [req-a215d5f4-142a-4d61-8449-1eee86b13339 req-5fe19273-bb9b-435f-b9c0-a02fb7144d6c service nova] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Received event network-changed-25593bac-ef49-4577-9496-d7f10b170036 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 718.044199] env[60044]: DEBUG nova.compute.manager [req-a215d5f4-142a-4d61-8449-1eee86b13339 req-5fe19273-bb9b-435f-b9c0-a02fb7144d6c service nova] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Refreshing instance network info cache due to event network-changed-25593bac-ef49-4577-9496-d7f10b170036. {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 718.044361] env[60044]: DEBUG oslo_concurrency.lockutils [req-a215d5f4-142a-4d61-8449-1eee86b13339 req-5fe19273-bb9b-435f-b9c0-a02fb7144d6c service nova] Acquiring lock "refresh_cache-ea4a243b-481f-421d-ba29-c88c828f754e" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 718.044504] env[60044]: DEBUG oslo_concurrency.lockutils [req-a215d5f4-142a-4d61-8449-1eee86b13339 req-5fe19273-bb9b-435f-b9c0-a02fb7144d6c service nova] Acquired lock "refresh_cache-ea4a243b-481f-421d-ba29-c88c828f754e" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 718.044661] env[60044]: DEBUG nova.network.neutron [req-a215d5f4-142a-4d61-8449-1eee86b13339 req-5fe19273-bb9b-435f-b9c0-a02fb7144d6c service nova] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Refreshing network info cache for port 25593bac-ef49-4577-9496-d7f10b170036 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 718.450083] env[60044]: DEBUG nova.network.neutron [req-a215d5f4-142a-4d61-8449-1eee86b13339 req-5fe19273-bb9b-435f-b9c0-a02fb7144d6c service nova] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Updated VIF entry in instance network info cache for port 25593bac-ef49-4577-9496-d7f10b170036. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 718.450502] env[60044]: DEBUG nova.network.neutron [req-a215d5f4-142a-4d61-8449-1eee86b13339 req-5fe19273-bb9b-435f-b9c0-a02fb7144d6c service nova] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Updating instance_info_cache with network_info: [{"id": "25593bac-ef49-4577-9496-d7f10b170036", "address": "fa:16:3e:09:41:3b", "network": {"id": "a79ad833-5bf2-4736-b86c-3b1a3ce5d862", "bridge": "br-int", "label": "tempest-ServersTestJSON-485546304-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "75310bd38faf4daea1ed2e141769a330", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5116f690-f825-4fee-8a47-42b073e716c5", "external-id": "nsx-vlan-transportzone-692", "segmentation_id": 692, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25593bac-ef", "ovs_interfaceid": "25593bac-ef49-4577-9496-d7f10b170036", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.460500] env[60044]: DEBUG oslo_concurrency.lockutils [req-a215d5f4-142a-4d61-8449-1eee86b13339 req-5fe19273-bb9b-435f-b9c0-a02fb7144d6c service nova] Releasing lock "refresh_cache-ea4a243b-481f-421d-ba29-c88c828f754e" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 720.671141] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "ce718fc3-6f75-49b9-8543-c953646ce0d9" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.812803] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "426f9016-4e69-4e46-87f6-a67f77da5dff" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 732.946944] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "27836d31-f379-4b4b-aed1-155f4a947779" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.043348] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "ef011071-c0e1-44e0-9940-285f2f45da67" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.068559] env[60044]: DEBUG oslo_concurrency.lockutils [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "ae25fbd0-3770-43fc-9850-cdb2065b5ce3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.421740] env[60044]: WARNING oslo_vmware.rw_handles [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 760.421740] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 760.421740] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 760.421740] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 760.421740] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 760.421740] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 760.421740] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 760.421740] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 760.421740] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 760.421740] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 760.421740] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 760.421740] env[60044]: ERROR oslo_vmware.rw_handles [ 760.422433] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/a01f8ecb-37b6-46bd-ba2a-a77ea917ed73/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 760.424233] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 760.424490] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Copying Virtual Disk [datastore2] vmware_temp/a01f8ecb-37b6-46bd-ba2a-a77ea917ed73/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/a01f8ecb-37b6-46bd-ba2a-a77ea917ed73/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 760.424787] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-332dd1ba-98a6-4d8e-9e8f-d39f5fa4dcdc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.437603] env[60044]: DEBUG oslo_vmware.api [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Waiting for the task: (returnval){ [ 760.437603] env[60044]: value = "task-2204751" [ 760.437603] env[60044]: _type = "Task" [ 760.437603] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 760.445660] env[60044]: DEBUG oslo_vmware.api [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Task: {'id': task-2204751, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 760.947596] env[60044]: DEBUG oslo_vmware.exceptions [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Fault InvalidArgument not matched. {{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 760.947895] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 760.948463] env[60044]: ERROR nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 760.948463] env[60044]: Faults: ['InvalidArgument'] [ 760.948463] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Traceback (most recent call last): [ 760.948463] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 760.948463] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] yield resources [ 760.948463] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 760.948463] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] self.driver.spawn(context, instance, image_meta, [ 760.948463] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 760.948463] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] self._vmops.spawn(context, instance, image_meta, injected_files, [ 760.948463] env[60044]: ERROR nova.compute.manager [instance: 
4e62d785-7c74-4d3a-9446-e690822d5386] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 760.948463] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] self._fetch_image_if_missing(context, vi) [ 760.948463] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] image_cache(vi, tmp_image_ds_loc) [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] vm_util.copy_virtual_disk( [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] session._wait_for_task(vmdk_copy_task) [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] return self.wait_for_task(task_ref) [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] return evt.wait() [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] result = hub.switch() [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 760.948920] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] return self.greenlet.switch() [ 760.949290] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 760.949290] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] self.f(*self.args, **self.kw) [ 760.949290] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 760.949290] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] raise exceptions.translate_fault(task_info.error) [ 760.949290] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 760.949290] env[60044]: ERROR nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Faults: ['InvalidArgument'] [ 760.949290] env[60044]: ERROR 
nova.compute.manager [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] [ 760.949290] env[60044]: INFO nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Terminating instance [ 760.950367] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 760.950574] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 760.950812] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7e8304dd-7a0f-4b18-aec0-105cda4e9cf0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.953393] env[60044]: DEBUG nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 760.953583] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 760.954348] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5effa33-6167-4d6c-9b1b-fd893cda3142 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.961643] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 760.962678] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3405719b-24e2-4ae3-9f0e-1687e60904ad {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.964200] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 760.964356] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 
tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 760.965007] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-31e8a81f-7213-4cf2-a677-58fbf29409da {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.970203] env[60044]: DEBUG oslo_vmware.api [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Waiting for the task: (returnval){ [ 760.970203] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52872de4-f1d1-c5e9-2548-d50af4c47c45" [ 760.970203] env[60044]: _type = "Task" [ 760.970203] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 760.977344] env[60044]: DEBUG oslo_vmware.api [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52872de4-f1d1-c5e9-2548-d50af4c47c45, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 761.032897] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 761.033126] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 761.033380] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Deleting the datastore file [datastore2] 4e62d785-7c74-4d3a-9446-e690822d5386 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 761.033643] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e7932a0e-fa56-4822-a036-51ea4cb1a30b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.042845] env[60044]: DEBUG oslo_vmware.api [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Waiting for the task: (returnval){ [ 761.042845] env[60044]: value = "task-2204753" [ 761.042845] env[60044]: _type = "Task" [ 761.042845] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 761.050311] env[60044]: DEBUG oslo_vmware.api [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Task: {'id': task-2204753, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 761.480986] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 761.481271] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Creating directory with path [datastore2] vmware_temp/7df4ff59-814c-4a14-92de-a515b3c0ca87/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 761.481494] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-98aaf6ce-0114-41ff-8775-ecef536d0d0d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.493127] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Created directory with path [datastore2] vmware_temp/7df4ff59-814c-4a14-92de-a515b3c0ca87/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 761.493423] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Fetch image to [datastore2] vmware_temp/7df4ff59-814c-4a14-92de-a515b3c0ca87/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 761.493620] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/7df4ff59-814c-4a14-92de-a515b3c0ca87/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 761.494362] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-359395b7-8281-4702-b21a-6536573ad0a2 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.501067] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c70c6294-43a7-49d4-8ada-47d4d175d994 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.510177] env[60044]: DEBUG oslo_vmware.service 
[-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18d594e2-46ab-49f9-95ee-f9f0e6879a4f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.540856] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb8b1e2e-aa75-48e2-98e7-45d1162119c8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.551285] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7a0a9b6a-8d35-41c1-a074-f97d3592234f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.552929] env[60044]: DEBUG oslo_vmware.api [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Task: {'id': task-2204753, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076265} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 761.553171] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 761.553376] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 761.553548] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 761.553716] env[60044]: INFO nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 761.556154] env[60044]: DEBUG nova.compute.claims [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 761.556327] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 761.556536] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 761.581038] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 761.581373] env[60044]: DEBUG nova.compute.utils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Instance 4e62d785-7c74-4d3a-9446-e690822d5386 could not be found. {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 761.584238] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 761.586343] env[60044]: DEBUG nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Instance disappeared during build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 761.586514] env[60044]: DEBUG nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 761.586675] env[60044]: DEBUG nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 761.586823] env[60044]: DEBUG nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 761.587169] env[60044]: DEBUG nova.network.neutron [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 761.614212] env[60044]: DEBUG nova.network.neutron [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 761.622902] env[60044]: INFO nova.compute.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Took 0.04 seconds to deallocate network for instance. [ 761.630100] env[60044]: DEBUG oslo_vmware.rw_handles [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7df4ff59-814c-4a14-92de-a515b3c0ca87/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 761.687348] env[60044]: DEBUG oslo_vmware.rw_handles [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 761.687488] env[60044]: DEBUG oslo_vmware.rw_handles [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7df4ff59-814c-4a14-92de-a515b3c0ca87/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 761.707625] env[60044]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "4e62d785-7c74-4d3a-9446-e690822d5386" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 245.931s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 761.719140] env[60044]: DEBUG nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 761.763605] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 761.763837] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 761.765209] env[60044]: INFO nova.compute.claims [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 762.085422] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2c37127-a116-4ec3-8114-c003e6958720 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.093880] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24705fd8-2763-4b6e-a53b-fbe339a75fb7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.126296] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26210311-4e31-4716-8dc9-e0863b936783 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.134612] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0568903c-7014-4763-b85a-26f3f21b183a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.148464] env[60044]: DEBUG nova.compute.provider_tree [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 762.161244] env[60044]: DEBUG 
nova.scheduler.client.report [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 762.173915] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.410s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 762.174577] env[60044]: DEBUG nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 762.211043] env[60044]: DEBUG nova.compute.utils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 762.211362] env[60044]: DEBUG nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 762.211665] env[60044]: DEBUG nova.network.neutron [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 762.223210] env[60044]: DEBUG nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Start building block device mappings for instance. 
{{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 762.275116] env[60044]: DEBUG nova.policy [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '885501b7d394413b86aad917534c4eed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6caa6881daf74a08b946fafd73ae022e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 762.290460] env[60044]: DEBUG nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Start spawning the instance on the hypervisor. {{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 762.313528] env[60044]: DEBUG nova.virt.hardware [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 762.313797] env[60044]: DEBUG nova.virt.hardware [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 762.313953] env[60044]: DEBUG nova.virt.hardware [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 762.314154] env[60044]: DEBUG nova.virt.hardware [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 762.314337] env[60044]: DEBUG nova.virt.hardware [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 762.314493] env[60044]: DEBUG 
nova.virt.hardware [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 762.314701] env[60044]: DEBUG nova.virt.hardware [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 762.314857] env[60044]: DEBUG nova.virt.hardware [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 762.315198] env[60044]: DEBUG nova.virt.hardware [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 762.315434] env[60044]: DEBUG nova.virt.hardware [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 762.315732] env[60044]: DEBUG nova.virt.hardware [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 762.316592] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-899f73b9-6230-4df9-8bc1-0a65ab70ad0c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.324361] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77047360-cea2-4a49-b135-414c1196a02b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.557423] env[60044]: DEBUG nova.network.neutron [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Successfully created port: d09fce1c-f0bd-4383-aaff-15568df1b8a9 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 763.065724] env[60044]: DEBUG nova.compute.manager [req-74642065-d9f3-4341-8622-b24b29611f8c req-0404f810-9d53-41bf-b4b7-dd752f6f31c0 service nova] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Received event network-vif-plugged-d09fce1c-f0bd-4383-aaff-15568df1b8a9 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 763.065867] env[60044]: DEBUG oslo_concurrency.lockutils [req-74642065-d9f3-4341-8622-b24b29611f8c req-0404f810-9d53-41bf-b4b7-dd752f6f31c0 service nova] 
Acquiring lock "df997589-61b6-4f68-9169-e6f9bee650c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 763.066085] env[60044]: DEBUG oslo_concurrency.lockutils [req-74642065-d9f3-4341-8622-b24b29611f8c req-0404f810-9d53-41bf-b4b7-dd752f6f31c0 service nova] Lock "df997589-61b6-4f68-9169-e6f9bee650c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 763.066287] env[60044]: DEBUG oslo_concurrency.lockutils [req-74642065-d9f3-4341-8622-b24b29611f8c req-0404f810-9d53-41bf-b4b7-dd752f6f31c0 service nova] Lock "df997589-61b6-4f68-9169-e6f9bee650c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 763.066409] env[60044]: DEBUG nova.compute.manager [req-74642065-d9f3-4341-8622-b24b29611f8c req-0404f810-9d53-41bf-b4b7-dd752f6f31c0 service nova] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] No waiting events found dispatching network-vif-plugged-d09fce1c-f0bd-4383-aaff-15568df1b8a9 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 763.066568] env[60044]: WARNING nova.compute.manager [req-74642065-d9f3-4341-8622-b24b29611f8c req-0404f810-9d53-41bf-b4b7-dd752f6f31c0 service nova] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Received unexpected event network-vif-plugged-d09fce1c-f0bd-4383-aaff-15568df1b8a9 for instance with vm_state building and task_state spawning. [ 763.087731] env[60044]: DEBUG nova.network.neutron [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Successfully updated port: d09fce1c-f0bd-4383-aaff-15568df1b8a9 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 763.101660] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "refresh_cache-df997589-61b6-4f68-9169-e6f9bee650c7" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 763.101808] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquired lock "refresh_cache-df997589-61b6-4f68-9169-e6f9bee650c7" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 763.101954] env[60044]: DEBUG nova.network.neutron [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 763.140341] env[60044]: DEBUG nova.network.neutron [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Instance cache missing network info. 
{{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 763.296779] env[60044]: DEBUG nova.network.neutron [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Updating instance_info_cache with network_info: [{"id": "d09fce1c-f0bd-4383-aaff-15568df1b8a9", "address": "fa:16:3e:56:32:03", "network": {"id": "bfc712e8-02e3-40e9-882a-693323525f4c", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1050433140-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6caa6881daf74a08b946fafd73ae022e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "76f377cd-5966-49b4-9210-907f592c694e", "external-id": "nsx-vlan-transportzone-124", "segmentation_id": 124, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd09fce1c-f0", "ovs_interfaceid": "d09fce1c-f0bd-4383-aaff-15568df1b8a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.307811] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Releasing lock "refresh_cache-df997589-61b6-4f68-9169-e6f9bee650c7" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 763.308230] env[60044]: DEBUG nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Instance network_info: |[{"id": "d09fce1c-f0bd-4383-aaff-15568df1b8a9", "address": "fa:16:3e:56:32:03", "network": {"id": "bfc712e8-02e3-40e9-882a-693323525f4c", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1050433140-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6caa6881daf74a08b946fafd73ae022e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "76f377cd-5966-49b4-9210-907f592c694e", "external-id": "nsx-vlan-transportzone-124", "segmentation_id": 124, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd09fce1c-f0", "ovs_interfaceid": "d09fce1c-f0bd-4383-aaff-15568df1b8a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 763.308641] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:56:32:03', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '76f377cd-5966-49b4-9210-907f592c694e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd09fce1c-f0bd-4383-aaff-15568df1b8a9', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 763.316263] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Creating folder: Project (6caa6881daf74a08b946fafd73ae022e). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 763.316754] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-57ea3885-71ce-40dd-b692-b8aab7b499ee {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.328411] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Created folder: Project (6caa6881daf74a08b946fafd73ae022e) in parent group-v449562. [ 763.328648] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Creating folder: Instances. Parent ref: group-v449611. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 763.328898] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-07b9de74-c5c7-437f-9904-a4cd31161e7d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.337241] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Created folder: Instances in parent group-v449611. [ 763.337660] env[60044]: DEBUG oslo.service.loopingcall [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 763.337879] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 763.338108] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-318651ac-2875-40f4-b2be-f8f573dc53f7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.356587] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 763.356587] env[60044]: value = "task-2204756" [ 763.356587] env[60044]: _type = "Task" [ 763.356587] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 763.363886] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204756, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 763.865382] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204756, 'name': CreateVM_Task, 'duration_secs': 0.295735} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 763.868039] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 763.868039] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 763.868039] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 763.868039] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 763.868039] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d08c6293-0457-4986-b590-9c251da1ec8f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.871348] env[60044]: DEBUG oslo_vmware.api [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Waiting for the task: (returnval){ [ 763.871348] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52f82271-e778-96aa-f7fe-cbec493a9ace" [ 763.871348] env[60044]: _type = "Task" [ 763.871348] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 763.878679] env[60044]: DEBUG oslo_vmware.api [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52f82271-e778-96aa-f7fe-cbec493a9ace, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 764.382265] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 764.382512] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 764.382715] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 765.019028] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 765.019253] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Cleaning up deleted instances {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 765.035898] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] There are 2 instances to clean {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 765.036203] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Instance has had 0 of 5 cleanup attempts {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 765.072900] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Instance has had 0 of 5 cleanup attempts {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 765.090097] env[60044]: DEBUG nova.compute.manager [req-6139ea68-af2a-4803-a8ef-7e194975c17d req-b2f1ff58-562c-46a8-b722-f2d125ef6a53 service nova] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Received event network-changed-d09fce1c-f0bd-4383-aaff-15568df1b8a9 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 765.090308] env[60044]: DEBUG nova.compute.manager [req-6139ea68-af2a-4803-a8ef-7e194975c17d req-b2f1ff58-562c-46a8-b722-f2d125ef6a53 service nova] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Refreshing instance network info cache due to event network-changed-d09fce1c-f0bd-4383-aaff-15568df1b8a9. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 765.091101] env[60044]: DEBUG oslo_concurrency.lockutils [req-6139ea68-af2a-4803-a8ef-7e194975c17d req-b2f1ff58-562c-46a8-b722-f2d125ef6a53 service nova] Acquiring lock "refresh_cache-df997589-61b6-4f68-9169-e6f9bee650c7" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 765.091101] env[60044]: DEBUG oslo_concurrency.lockutils [req-6139ea68-af2a-4803-a8ef-7e194975c17d req-b2f1ff58-562c-46a8-b722-f2d125ef6a53 service nova] Acquired lock "refresh_cache-df997589-61b6-4f68-9169-e6f9bee650c7" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 765.091101] env[60044]: DEBUG nova.network.neutron [req-6139ea68-af2a-4803-a8ef-7e194975c17d req-b2f1ff58-562c-46a8-b722-f2d125ef6a53 service nova] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Refreshing network info cache for port d09fce1c-f0bd-4383-aaff-15568df1b8a9 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 765.108699] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 765.108699] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Cleaning up deleted instances with incomplete migration {{(pid=60044) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 765.117548] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 765.326974] env[60044]: DEBUG nova.network.neutron [req-6139ea68-af2a-4803-a8ef-7e194975c17d req-b2f1ff58-562c-46a8-b722-f2d125ef6a53 service nova] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Updated VIF entry in instance network info cache for port d09fce1c-f0bd-4383-aaff-15568df1b8a9. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 765.327369] env[60044]: DEBUG nova.network.neutron [req-6139ea68-af2a-4803-a8ef-7e194975c17d req-b2f1ff58-562c-46a8-b722-f2d125ef6a53 service nova] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Updating instance_info_cache with network_info: [{"id": "d09fce1c-f0bd-4383-aaff-15568df1b8a9", "address": "fa:16:3e:56:32:03", "network": {"id": "bfc712e8-02e3-40e9-882a-693323525f4c", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1050433140-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6caa6881daf74a08b946fafd73ae022e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "76f377cd-5966-49b4-9210-907f592c694e", "external-id": "nsx-vlan-transportzone-124", "segmentation_id": 124, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd09fce1c-f0", "ovs_interfaceid": "d09fce1c-f0bd-4383-aaff-15568df1b8a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 765.337114] env[60044]: DEBUG oslo_concurrency.lockutils [req-6139ea68-af2a-4803-a8ef-7e194975c17d req-b2f1ff58-562c-46a8-b722-f2d125ef6a53 service nova] Releasing lock "refresh_cache-df997589-61b6-4f68-9169-e6f9bee650c7" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 767.122845] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 767.123179] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 769.014627] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 769.018504] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 769.018663] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 769.018856] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9818}} [ 769.041468] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 769.041659] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 769.041756] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 769.041863] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 769.041994] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 769.042453] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 769.042598] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 769.042723] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 769.042845] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 769.042965] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 769.043097] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. 
{{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 769.043627] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 769.043750] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 769.043903] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 769.060337] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 769.060552] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 769.060715] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 769.060868] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 769.063779] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00231228-e0b7-412f-b4db-2aad9e538b6d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.072185] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-166bab44-8864-4aaa-b00e-35d973dcdeb6 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.085843] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98471d79-e449-4802-8e09-595ec4b23c53 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.092221] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd3e97bf-7037-44b5-be07-f4aec72ee169 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.121919] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181269MB 
free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 769.122089] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 769.122316] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 769.246208] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 23984fc7-95de-43c3-a21e-894fab241dce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 769.246374] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ce718fc3-6f75-49b9-8543-c953646ce0d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 769.246506] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 426f9016-4e69-4e46-87f6-a67f77da5dff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 769.246628] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 769.246746] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 27836d31-f379-4b4b-aed1-155f4a947779 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 769.246863] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ef011071-c0e1-44e0-9940-285f2f45da67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 769.246977] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 6874067b-8e9b-4242-9a5f-6312f1484a00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 769.247103] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance f03f507b-364f-41b9-ad33-dcb56ab03317 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 769.247216] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ea4a243b-481f-421d-ba29-c88c828f754e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 769.247327] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance df997589-61b6-4f68-9169-e6f9bee650c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 769.258768] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance b62dda0a-da1d-4109-a925-bb32d01da242 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.269259] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 6604de35-7683-4d5d-ac6f-13752ccb940c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.278900] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 189903f4-37c9-4331-bb23-245ed68ecaae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.288486] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 75cc0c18-27d3-4074-897b-08812a11829c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.297753] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 22aa54d4-80ec-4d56-9239-41810c469b9e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.307769] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 885fe65d-ee02-4ed7-8d59-109775086038 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.317411] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ec879414-4534-4d0e-a65e-65baff80b16e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.327391] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance cde76b14-ee01-44c8-8004-39cdf91e9889 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.338554] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ca732c56-b1d1-40bf-96b6-4b93bc5ff29d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.348572] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 6f0c0004-7fd2-49bf-bb1e-48774c481497 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.359035] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.370310] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance e84f3fe9-d377-4018-8874-972d1f888208 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.379574] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 0d87148b-1493-4777-a8b3-b94a64e8eca6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 769.379802] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 769.379960] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 769.395835] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Refreshing inventories for resource provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 769.409658] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Updating ProviderTree inventory for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 769.409848] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Updating inventory in ProviderTree for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 769.420642] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Refreshing aggregate 
associations for resource provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca, aggregates: None {{(pid=60044) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 769.436610] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Refreshing trait associations for resource provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE {{(pid=60044) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 769.690797] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6838b6d-759d-4c51-9889-7c969bfeedda {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.698310] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16fe34db-e9aa-4f4b-9905-29047c156ea1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.727432] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcb367b6-c16a-46c2-8b89-7bfb67652467 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.734176] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc90cf70-bf86-4422-b7a0-a87fe031d48a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.746660] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 769.756009] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 769.771821] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 769.771821] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.648s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 770.746218] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 
770.746454] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 772.020551] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 793.742111] env[60044]: DEBUG nova.compute.manager [req-0f75ebbd-e781-4ea5-8137-7c8c4e562514 req-74874ba0-d988-4ee0-b19a-4985fdf962cb service nova] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Received event network-vif-deleted-c0bedb65-8124-42c4-bfdb-81d886ea053a {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 802.211669] env[60044]: DEBUG nova.compute.manager [req-58736c43-6ebb-4ddf-89ac-13a4565311c1 req-7c9ed560-b97a-4c44-835a-5bbeddf26550 service nova] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Received event network-vif-deleted-ce27f8fc-b7cd-4215-9123-9f7bd3df4f42 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 804.673034] env[60044]: DEBUG nova.compute.manager [req-d0c17c4e-d883-434f-a525-c749e7b98c4b req-187eba0f-f49a-4997-920b-e4c7dcf5e972 service nova] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Received event network-vif-deleted-25593bac-ef49-4577-9496-d7f10b170036 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 804.673304] env[60044]: DEBUG nova.compute.manager [req-d0c17c4e-d883-434f-a525-c749e7b98c4b req-187eba0f-f49a-4997-920b-e4c7dcf5e972 service nova] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Received event network-vif-deleted-d09fce1c-f0bd-4383-aaff-15568df1b8a9 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 809.459705] env[60044]: WARNING oslo_vmware.rw_handles [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 809.459705] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 809.459705] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 809.459705] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 809.459705] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 809.459705] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 809.459705] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 809.459705] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 809.459705] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 809.459705] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 809.459705] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 809.459705] env[60044]: ERROR oslo_vmware.rw_handles [ 809.460589] env[60044]: DEBUG nova.virt.vmwareapi.images [None 
req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/7df4ff59-814c-4a14-92de-a515b3c0ca87/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 809.462263] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 809.462263] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Copying Virtual Disk [datastore2] vmware_temp/7df4ff59-814c-4a14-92de-a515b3c0ca87/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/7df4ff59-814c-4a14-92de-a515b3c0ca87/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 809.462428] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e8a744c6-e045-4872-a71b-4dd2406bc9c2 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.472139] env[60044]: DEBUG oslo_vmware.api [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Waiting for the task: (returnval){ [ 809.472139] env[60044]: value = "task-2204757" [ 809.472139] env[60044]: _type = "Task" [ 809.472139] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 809.481537] env[60044]: DEBUG oslo_vmware.api [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Task: {'id': task-2204757, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 809.983435] env[60044]: DEBUG oslo_vmware.exceptions [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Fault InvalidArgument not matched. 
{{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 809.983709] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 809.984298] env[60044]: ERROR nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 809.984298] env[60044]: Faults: ['InvalidArgument'] [ 809.984298] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Traceback (most recent call last): [ 809.984298] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 809.984298] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] yield resources [ 809.984298] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 809.984298] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] self.driver.spawn(context, instance, image_meta, [ 809.984298] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 809.984298] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] self._vmops.spawn(context, instance, image_meta, injected_files, [ 809.984298] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 809.984298] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] self._fetch_image_if_missing(context, vi) [ 809.984298] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] image_cache(vi, tmp_image_ds_loc) [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] vm_util.copy_virtual_disk( [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] session._wait_for_task(vmdk_copy_task) [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] return self.wait_for_task(task_ref) [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] return evt.wait() [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] result = hub.switch() [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 809.984728] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] return self.greenlet.switch() [ 809.985124] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 809.985124] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] self.f(*self.args, **self.kw) [ 809.985124] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 809.985124] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] raise exceptions.translate_fault(task_info.error) [ 809.985124] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 809.985124] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Faults: ['InvalidArgument'] [ 809.985124] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] [ 809.985124] env[60044]: INFO nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Terminating instance [ 809.992024] env[60044]: DEBUG nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 809.992024] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 809.992024] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 809.992024] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 809.992484] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e293fe38-6adf-4505-b3d4-bf85e615bba4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.998368] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9eef05fa-5a57-4d01-974a-274efff4921c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.006225] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 810.008021] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cd070616-b5de-4226-b3d4-5765df8fd504 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.009889] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 810.010267] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 810.010982] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bf8b40f6-f198-43af-a0cd-6dd6a859ce01 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.018501] env[60044]: DEBUG oslo_vmware.api [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Waiting for the task: (returnval){ [ 810.018501] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]5275fb6b-e586-72cd-a72f-00e320bf49c2" [ 810.018501] env[60044]: _type = "Task" [ 810.018501] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 810.026427] env[60044]: DEBUG oslo_vmware.api [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]5275fb6b-e586-72cd-a72f-00e320bf49c2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 810.085021] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 810.085021] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 810.085021] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Deleting the datastore file [datastore2] 23984fc7-95de-43c3-a21e-894fab241dce {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 810.085021] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d5b0b007-c5ae-43e0-aa9d-4a61c6eaacd8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.091333] env[60044]: DEBUG oslo_vmware.api [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Waiting for the task: (returnval){ [ 810.091333] env[60044]: value = "task-2204759" [ 810.091333] env[60044]: _type = "Task" [ 810.091333] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 810.100415] env[60044]: DEBUG oslo_vmware.api [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Task: {'id': task-2204759, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 810.529556] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 810.529899] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Creating directory with path [datastore2] vmware_temp/8ca147f9-e75d-4bdb-8aed-11c089ea053a/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 810.530145] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-15734921-050f-4762-a7c3-9bc07983c657 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.545166] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Created directory with path [datastore2] vmware_temp/8ca147f9-e75d-4bdb-8aed-11c089ea053a/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 810.545493] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Fetch image to [datastore2] vmware_temp/8ca147f9-e75d-4bdb-8aed-11c089ea053a/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 810.546275] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/8ca147f9-e75d-4bdb-8aed-11c089ea053a/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 810.546733] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfbd53b7-87c6-4fa2-b5c5-2213131971f0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.554906] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e815e0c4-ae66-4dcd-9ae9-66bbc71d5265 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.566995] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-448e6ad4-3e1e-4b8e-aed3-9063df416b75 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.612105] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93fc501f-cab0-4dca-a95c-272d88971c02 {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.618810] env[60044]: DEBUG oslo_vmware.api [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Task: {'id': task-2204759, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076895} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 810.619788] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 810.620255] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 810.620363] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 810.620533] env[60044]: INFO nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Took 0.63 seconds to destroy the instance on the hypervisor. 
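[editor's note] The "Waiting for the task: ... to complete", "progress is 0%" and "completed successfully ... 'duration_secs'" entries around the CopyVirtualDisk_Task and DeleteDatastoreFile_Task above all come from the same poll-until-done pattern in oslo.vmware's wait_for_task. A minimal illustrative sketch of that pattern is below; it uses only the standard library, the fetch_task_info callable and TaskFailed exception are hypothetical stand-ins, and this is not the oslo.vmware implementation itself.

import time

class TaskFailed(Exception):
    """Stand-in for the VIM fault surfaced as VimFaultException in the log above."""

def wait_for_task(fetch_task_info, task_id, poll_interval=0.5):
    # Poll the remote task until it leaves its running state, mirroring the
    # repeated "progress is N%" DEBUG lines followed by either
    # "completed successfully" or a raised fault.
    while True:
        info = fetch_task_info(task_id)          # e.g. {'state': 'running', 'progress': 0}
        if info['state'] == 'success':
            return info                          # corresponds to "completed successfully"
        if info['state'] == 'error':
            raise TaskFailed(info.get('error'))  # corresponds to the InvalidArgument fault
        time.sleep(poll_interval)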
[ 810.622683] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1bf74f8f-cf60-476c-8d68-a7649b97da8f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.626131] env[60044]: DEBUG nova.compute.claims [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 810.626131] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 810.626131] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 810.657110] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 810.735673] env[60044]: DEBUG oslo_vmware.rw_handles [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8ca147f9-e75d-4bdb-8aed-11c089ea053a/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 810.799193] env[60044]: DEBUG oslo_vmware.rw_handles [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 810.799384] env[60044]: DEBUG oslo_vmware.rw_handles [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8ca147f9-e75d-4bdb-8aed-11c089ea053a/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 810.937206] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4da24d9-d307-4cad-9780-61c265b10534 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.945533] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-690252fe-a529-401f-b03b-8eadef354a21 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.977417] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec5d56b7-38a7-490f-bd3f-e3650ea0da88 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.985405] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5aacd259-d68f-49d2-b3da-f5bf8bf7ea75 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.000457] env[60044]: DEBUG nova.compute.provider_tree [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 811.017133] env[60044]: DEBUG nova.scheduler.client.report [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 811.036708] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.410s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 811.036708] env[60044]: ERROR nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 811.036708] env[60044]: Faults: ['InvalidArgument'] [ 811.036708] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Traceback (most recent call last): [ 811.036708] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 811.036708] env[60044]: ERROR nova.compute.manager 
[instance: 23984fc7-95de-43c3-a21e-894fab241dce] self.driver.spawn(context, instance, image_meta, [ 811.036708] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 811.036708] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] self._vmops.spawn(context, instance, image_meta, injected_files, [ 811.036708] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 811.036708] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] self._fetch_image_if_missing(context, vi) [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] image_cache(vi, tmp_image_ds_loc) [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] vm_util.copy_virtual_disk( [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] session._wait_for_task(vmdk_copy_task) [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] return self.wait_for_task(task_ref) [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] return evt.wait() [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] result = hub.switch() [ 811.037067] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 811.037618] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] return self.greenlet.switch() [ 811.037618] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 811.037618] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] self.f(*self.args, **self.kw) [ 811.037618] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 811.037618] 
env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] raise exceptions.translate_fault(task_info.error) [ 811.037618] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 811.037618] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Faults: ['InvalidArgument'] [ 811.037618] env[60044]: ERROR nova.compute.manager [instance: 23984fc7-95de-43c3-a21e-894fab241dce] [ 811.037618] env[60044]: DEBUG nova.compute.utils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] VimFaultException {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 811.041161] env[60044]: DEBUG nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Build of instance 23984fc7-95de-43c3-a21e-894fab241dce was re-scheduled: A specified parameter was not correct: fileType [ 811.041161] env[60044]: Faults: ['InvalidArgument'] {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 811.041262] env[60044]: DEBUG nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 811.041929] env[60044]: DEBUG nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 811.041929] env[60044]: DEBUG nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 811.041929] env[60044]: DEBUG nova.network.neutron [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 811.683218] env[60044]: DEBUG nova.network.neutron [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 811.701824] env[60044]: INFO nova.compute.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Took 0.66 seconds to deallocate network for instance. [ 811.835837] env[60044]: INFO nova.scheduler.client.report [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Deleted allocations for instance 23984fc7-95de-43c3-a21e-894fab241dce [ 811.872725] env[60044]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "23984fc7-95de-43c3-a21e-894fab241dce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 293.688s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 811.873808] env[60044]: DEBUG oslo_concurrency.lockutils [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "23984fc7-95de-43c3-a21e-894fab241dce" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 95.630s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 811.874035] env[60044]: DEBUG oslo_concurrency.lockutils [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "23984fc7-95de-43c3-a21e-894fab241dce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 811.874236] env[60044]: DEBUG oslo_concurrency.lockutils [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "23984fc7-95de-43c3-a21e-894fab241dce-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 811.874397] env[60044]: DEBUG oslo_concurrency.lockutils [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "23984fc7-95de-43c3-a21e-894fab241dce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 811.879661] env[60044]: INFO nova.compute.manager [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Terminating instance [ 811.882536] env[60044]: DEBUG nova.compute.manager [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 811.882830] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 811.883506] env[60044]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-91b53389-de52-4047-90da-610dc522abec {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.894329] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38904c9f-78e8-48a0-823b-e41b5441b977 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.907185] env[60044]: DEBUG nova.compute.manager [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] [instance: b62dda0a-da1d-4109-a925-bb32d01da242] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 811.933962] env[60044]: WARNING nova.virt.vmwareapi.vmops [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 23984fc7-95de-43c3-a21e-894fab241dce could not be found. 
[ 811.934363] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 811.936018] env[60044]: INFO nova.compute.manager [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Took 0.05 seconds to destroy the instance on the hypervisor. [ 811.936018] env[60044]: DEBUG oslo.service.loopingcall [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 811.936018] env[60044]: DEBUG nova.compute.manager [-] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 811.936018] env[60044]: DEBUG nova.network.neutron [-] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 811.938307] env[60044]: DEBUG nova.compute.manager [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] [instance: b62dda0a-da1d-4109-a925-bb32d01da242] Instance disappeared before build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 811.962174] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "b62dda0a-da1d-4109-a925-bb32d01da242" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.456s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 811.984381] env[60044]: DEBUG nova.compute.manager [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] [instance: 6604de35-7683-4d5d-ac6f-13752ccb940c] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 812.013821] env[60044]: DEBUG nova.compute.manager [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] [instance: 6604de35-7683-4d5d-ac6f-13752ccb940c] Instance disappeared before build. 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 812.045909] env[60044]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "6604de35-7683-4d5d-ac6f-13752ccb940c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.937s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 812.060912] env[60044]: DEBUG nova.compute.manager [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] [instance: 189903f4-37c9-4331-bb23-245ed68ecaae] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 812.087520] env[60044]: DEBUG nova.compute.manager [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] [instance: 189903f4-37c9-4331-bb23-245ed68ecaae] Instance disappeared before build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 812.118466] env[60044]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "189903f4-37c9-4331-bb23-245ed68ecaae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.591s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 812.134291] env[60044]: DEBUG nova.compute.manager [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] [instance: 75cc0c18-27d3-4074-897b-08812a11829c] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 812.165036] env[60044]: DEBUG nova.compute.manager [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] [instance: 75cc0c18-27d3-4074-897b-08812a11829c] Instance disappeared before build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 812.172959] env[60044]: DEBUG nova.network.neutron [-] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 812.192123] env[60044]: INFO nova.compute.manager [-] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Took 0.26 seconds to deallocate network for instance. 
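The repeated "Acquiring lock / acquired / released ... held N s" lines in this stretch come from oslo.concurrency: the build path and the delete path serialize on a per-instance-UUID lock, which is why the terminate request above waited 95.630 s before running, i.e. until the failed build released the lock it had held for 293.688 s. A minimal sketch of that pattern follows (illustrative only, not Nova's code; the UUID-named lock and the two stub functions simply mirror what the log shows).

    from oslo_concurrency import lockutils

    INSTANCE_UUID = '23984fc7-95de-43c3-a21e-894fab241dce'

    @lockutils.synchronized(INSTANCE_UUID)
    def _locked_do_build_and_run_instance():
        # Holds the per-instance lock for the whole build attempt
        # (~293.7 s in the log above, most of it spent in spawn()).
        pass

    @lockutils.synchronized(INSTANCE_UUID)
    def do_terminate_instance():
        # Blocks on the same named lock until the build releases it
        # (~95.6 s of "waited" time in the log above), then destroys
        # the instance on the hypervisor.
        pass

Because the lock is keyed on the instance UUID, operations on different instances (the "Instance disappeared before build" entries for other UUIDs here) proceed independently and only contend for their own locks.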
[ 812.198339] env[60044]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "75cc0c18-27d3-4074-897b-08812a11829c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.887s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 812.208987] env[60044]: DEBUG nova.compute.manager [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] [instance: 22aa54d4-80ec-4d56-9239-41810c469b9e] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 812.273606] env[60044]: DEBUG nova.compute.manager [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] [instance: 22aa54d4-80ec-4d56-9239-41810c469b9e] Instance disappeared before build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 812.306945] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "22aa54d4-80ec-4d56-9239-41810c469b9e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.981s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 812.317734] env[60044]: DEBUG nova.compute.manager [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] [instance: 885fe65d-ee02-4ed7-8d59-109775086038] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 812.368600] env[60044]: DEBUG nova.compute.manager [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] [instance: 885fe65d-ee02-4ed7-8d59-109775086038] Instance disappeared before build. 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 812.383187] env[60044]: DEBUG oslo_concurrency.lockutils [None req-973ea549-8a3c-46c9-ae4d-872c6ac2ad39 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "23984fc7-95de-43c3-a21e-894fab241dce" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.509s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 812.398928] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "885fe65d-ee02-4ed7-8d59-109775086038" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.641s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 812.409311] env[60044]: DEBUG nova.compute.manager [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] [instance: ec879414-4534-4d0e-a65e-65baff80b16e] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 812.443530] env[60044]: DEBUG nova.compute.manager [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] [instance: ec879414-4534-4d0e-a65e-65baff80b16e] Instance disappeared before build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 812.469614] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "ec879414-4534-4d0e-a65e-65baff80b16e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.513s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 812.481788] env[60044]: DEBUG nova.compute.manager [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] [instance: cde76b14-ee01-44c8-8004-39cdf91e9889] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 812.519436] env[60044]: DEBUG nova.compute.manager [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] [instance: cde76b14-ee01-44c8-8004-39cdf91e9889] Instance disappeared before build. 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 812.541416] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "cde76b14-ee01-44c8-8004-39cdf91e9889" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.555s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 812.554125] env[60044]: DEBUG nova.compute.manager [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] [instance: ca732c56-b1d1-40bf-96b6-4b93bc5ff29d] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 812.589028] env[60044]: DEBUG nova.compute.manager [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] [instance: ca732c56-b1d1-40bf-96b6-4b93bc5ff29d] Instance disappeared before build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 812.620158] env[60044]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "ca732c56-b1d1-40bf-96b6-4b93bc5ff29d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.189s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 812.636699] env[60044]: DEBUG nova.compute.manager [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] [instance: 6f0c0004-7fd2-49bf-bb1e-48774c481497] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 812.675866] env[60044]: DEBUG nova.compute.manager [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] [instance: 6f0c0004-7fd2-49bf-bb1e-48774c481497] Instance disappeared before build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 812.710247] env[60044]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "6f0c0004-7fd2-49bf-bb1e-48774c481497" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.039s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 812.724441] env[60044]: DEBUG nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Starting instance... 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 812.795057] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 812.795140] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 812.796928] env[60044]: INFO nova.compute.claims [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 813.014372] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b098fa5-a40c-4a13-b322-d1331f59180b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.030512] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90f0fd60-3048-4c55-b500-79414ad4b93b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.064201] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89ddf5a0-06bb-4713-b98d-72fd80f28b3f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.074022] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5653d00-5290-445b-babd-fcd99b62f2ac {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.093325] env[60044]: DEBUG nova.compute.provider_tree [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 813.108225] env[60044]: DEBUG nova.scheduler.client.report [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 813.133465] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.338s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 813.135536] env[60044]: DEBUG nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 813.193020] env[60044]: DEBUG nova.compute.utils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 813.195111] env[60044]: DEBUG nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 813.195345] env[60044]: DEBUG nova.network.neutron [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 813.211635] env[60044]: DEBUG nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 813.313119] env[60044]: DEBUG nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 813.340734] env[60044]: DEBUG nova.virt.hardware [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 813.340975] env[60044]: DEBUG nova.virt.hardware [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 813.341157] env[60044]: DEBUG nova.virt.hardware [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 813.341335] env[60044]: DEBUG nova.virt.hardware [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 813.341478] env[60044]: DEBUG nova.virt.hardware [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 813.341617] env[60044]: DEBUG nova.virt.hardware [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 813.341885] env[60044]: DEBUG nova.virt.hardware [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 813.342123] env[60044]: DEBUG nova.virt.hardware [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 813.342301] env[60044]: DEBUG nova.virt.hardware [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 
tempest-ServersTestJSON-1097912379-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 813.342460] env[60044]: DEBUG nova.virt.hardware [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 813.342626] env[60044]: DEBUG nova.virt.hardware [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 813.343484] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-059bee16-a812-489d-8903-6dda3966563f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.353189] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1efd115c-3f8c-4cbb-a025-6b72bd767c74 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.407902] env[60044]: DEBUG nova.policy [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4817f779374d427f8a2ad8e25b0d97f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '50469143b9b441119f5bfcff560f3a9c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 814.445275] env[60044]: DEBUG nova.network.neutron [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Successfully created port: 3a23457b-ddde-466a-8016-10ae2fdbad20 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 816.027038] env[60044]: DEBUG nova.network.neutron [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Successfully updated port: 3a23457b-ddde-466a-8016-10ae2fdbad20 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 816.046374] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "refresh_cache-0e99d8ab-6b62-4ea9-b7c9-06394fa93e09" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 816.046634] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquired lock "refresh_cache-0e99d8ab-6b62-4ea9-b7c9-06394fa93e09" {{(pid=60044) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 816.047559] env[60044]: DEBUG nova.network.neutron [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 816.195198] env[60044]: DEBUG nova.network.neutron [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 816.564336] env[60044]: DEBUG nova.network.neutron [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Updating instance_info_cache with network_info: [{"id": "3a23457b-ddde-466a-8016-10ae2fdbad20", "address": "fa:16:3e:53:59:e2", "network": {"id": "b179d623-743e-4167-851e-6bb5cdf943dd", "bridge": "br-int", "label": "tempest-ServersTestJSON-718392761-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "50469143b9b441119f5bfcff560f3a9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0ea0fc1b-0424-46ec-bef5-6b57b7d184d8", "external-id": "nsx-vlan-transportzone-618", "segmentation_id": 618, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a23457b-dd", "ovs_interfaceid": "3a23457b-ddde-466a-8016-10ae2fdbad20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 816.581145] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Releasing lock "refresh_cache-0e99d8ab-6b62-4ea9-b7c9-06394fa93e09" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 816.581543] env[60044]: DEBUG nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Instance network_info: |[{"id": "3a23457b-ddde-466a-8016-10ae2fdbad20", "address": "fa:16:3e:53:59:e2", "network": {"id": "b179d623-743e-4167-851e-6bb5cdf943dd", "bridge": "br-int", "label": "tempest-ServersTestJSON-718392761-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": 
"50469143b9b441119f5bfcff560f3a9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0ea0fc1b-0424-46ec-bef5-6b57b7d184d8", "external-id": "nsx-vlan-transportzone-618", "segmentation_id": 618, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a23457b-dd", "ovs_interfaceid": "3a23457b-ddde-466a-8016-10ae2fdbad20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 816.582168] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:53:59:e2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0ea0fc1b-0424-46ec-bef5-6b57b7d184d8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3a23457b-ddde-466a-8016-10ae2fdbad20', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 816.599567] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Creating folder: Project (50469143b9b441119f5bfcff560f3a9c). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 816.600263] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-26200ecf-e824-4ab7-9319-aac48963d552 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.617019] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Created folder: Project (50469143b9b441119f5bfcff560f3a9c) in parent group-v449562. [ 816.617019] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Creating folder: Instances. Parent ref: group-v449614. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 816.617019] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7d2d558a-06a6-414f-8853-fb1323d259e3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.625543] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Created folder: Instances in parent group-v449614. [ 816.628024] env[60044]: DEBUG oslo.service.loopingcall [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 816.628024] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 816.628024] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-67c5a69c-a6ef-4af3-9186-e15dd0558ec1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.650228] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 816.650228] env[60044]: value = "task-2204762" [ 816.650228] env[60044]: _type = "Task" [ 816.650228] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 816.657619] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204762, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 816.911140] env[60044]: DEBUG nova.compute.manager [req-11e38ef0-be24-41ef-86b8-407fcfe6486e req-e36de6d7-8848-4680-a192-d485acf0f784 service nova] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Received event network-vif-plugged-3a23457b-ddde-466a-8016-10ae2fdbad20 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 816.911140] env[60044]: DEBUG oslo_concurrency.lockutils [req-11e38ef0-be24-41ef-86b8-407fcfe6486e req-e36de6d7-8848-4680-a192-d485acf0f784 service nova] Acquiring lock "0e99d8ab-6b62-4ea9-b7c9-06394fa93e09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 816.911140] env[60044]: DEBUG oslo_concurrency.lockutils [req-11e38ef0-be24-41ef-86b8-407fcfe6486e req-e36de6d7-8848-4680-a192-d485acf0f784 service nova] Lock "0e99d8ab-6b62-4ea9-b7c9-06394fa93e09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 816.911140] env[60044]: DEBUG oslo_concurrency.lockutils [req-11e38ef0-be24-41ef-86b8-407fcfe6486e req-e36de6d7-8848-4680-a192-d485acf0f784 service nova] Lock "0e99d8ab-6b62-4ea9-b7c9-06394fa93e09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 816.911445] env[60044]: DEBUG nova.compute.manager [req-11e38ef0-be24-41ef-86b8-407fcfe6486e req-e36de6d7-8848-4680-a192-d485acf0f784 service nova] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] No waiting events found dispatching network-vif-plugged-3a23457b-ddde-466a-8016-10ae2fdbad20 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 816.911852] env[60044]: WARNING nova.compute.manager [req-11e38ef0-be24-41ef-86b8-407fcfe6486e req-e36de6d7-8848-4680-a192-d485acf0f784 service nova] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Received unexpected event network-vif-plugged-3a23457b-ddde-466a-8016-10ae2fdbad20 for instance with vm_state building and task_state spawning. [ 817.161373] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204762, 'name': CreateVM_Task, 'duration_secs': 0.303505} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 817.161759] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 817.162321] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 817.162702] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 817.162783] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 817.163308] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5e69923d-b37a-4f84-9876-80bebab3bb7d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 817.168308] env[60044]: DEBUG oslo_vmware.api [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Waiting for the task: (returnval){ [ 817.168308] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52bcd23b-ac60-c9b7-fcc3-b1ed9cc4c70c" [ 817.168308] env[60044]: _type = "Task" [ 817.168308] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 817.187198] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 817.187871] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 817.187871] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 819.405589] env[60044]: DEBUG nova.compute.manager [req-c08ccc3a-79ee-4c1e-a88f-7de5b0fe63b6 req-4e3e7c31-46cc-4845-b288-15df97b5a27f service nova] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Received event network-changed-3a23457b-ddde-466a-8016-10ae2fdbad20 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 819.405872] env[60044]: DEBUG nova.compute.manager [req-c08ccc3a-79ee-4c1e-a88f-7de5b0fe63b6 req-4e3e7c31-46cc-4845-b288-15df97b5a27f service nova] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Refreshing instance network info cache due to event network-changed-3a23457b-ddde-466a-8016-10ae2fdbad20. {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 819.405990] env[60044]: DEBUG oslo_concurrency.lockutils [req-c08ccc3a-79ee-4c1e-a88f-7de5b0fe63b6 req-4e3e7c31-46cc-4845-b288-15df97b5a27f service nova] Acquiring lock "refresh_cache-0e99d8ab-6b62-4ea9-b7c9-06394fa93e09" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 819.406200] env[60044]: DEBUG oslo_concurrency.lockutils [req-c08ccc3a-79ee-4c1e-a88f-7de5b0fe63b6 req-4e3e7c31-46cc-4845-b288-15df97b5a27f service nova] Acquired lock "refresh_cache-0e99d8ab-6b62-4ea9-b7c9-06394fa93e09" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 819.406463] env[60044]: DEBUG nova.network.neutron [req-c08ccc3a-79ee-4c1e-a88f-7de5b0fe63b6 req-4e3e7c31-46cc-4845-b288-15df97b5a27f service nova] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Refreshing network info cache for port 3a23457b-ddde-466a-8016-10ae2fdbad20 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 819.741140] env[60044]: DEBUG nova.network.neutron [req-c08ccc3a-79ee-4c1e-a88f-7de5b0fe63b6 req-4e3e7c31-46cc-4845-b288-15df97b5a27f service nova] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Updated VIF entry in instance network info cache for port 3a23457b-ddde-466a-8016-10ae2fdbad20. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 819.741140] env[60044]: DEBUG nova.network.neutron [req-c08ccc3a-79ee-4c1e-a88f-7de5b0fe63b6 req-4e3e7c31-46cc-4845-b288-15df97b5a27f service nova] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Updating instance_info_cache with network_info: [{"id": "3a23457b-ddde-466a-8016-10ae2fdbad20", "address": "fa:16:3e:53:59:e2", "network": {"id": "b179d623-743e-4167-851e-6bb5cdf943dd", "bridge": "br-int", "label": "tempest-ServersTestJSON-718392761-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "50469143b9b441119f5bfcff560f3a9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0ea0fc1b-0424-46ec-bef5-6b57b7d184d8", "external-id": "nsx-vlan-transportzone-618", "segmentation_id": 618, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a23457b-dd", "ovs_interfaceid": "3a23457b-ddde-466a-8016-10ae2fdbad20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 819.751409] env[60044]: DEBUG oslo_concurrency.lockutils [req-c08ccc3a-79ee-4c1e-a88f-7de5b0fe63b6 req-4e3e7c31-46cc-4845-b288-15df97b5a27f service nova] Releasing lock "refresh_cache-0e99d8ab-6b62-4ea9-b7c9-06394fa93e09" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 824.557610] env[60044]: DEBUG nova.compute.manager [req-6b110c54-3d48-42d0-81bd-66fe6b8b680e req-232ae3dc-ca85-4386-8665-5188043034f2 service nova] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Received event network-vif-deleted-3a23457b-ddde-466a-8016-10ae2fdbad20 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 828.018175] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 828.018740] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 830.014571] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 830.039652] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 830.040837] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] 
Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 831.021681] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 831.021681] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 831.021681] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 831.021681] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 831.052115] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 831.052115] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 831.052115] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 831.052115] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 831.052115] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 831.052308] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. 
{{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 831.052308] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 831.060448] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 831.060659] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 831.060846] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 831.061054] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 831.062106] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cf960b5-db0c-435d-91df-2abb10c342cf {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.070854] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f9b1093-bad5-4332-9c59-835776300f2d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.085163] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeefe1a9-3074-4cbb-9ba4-1af496f500a4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.091678] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46fdc635-8ae7-41c8-bc8e-b8485bbec8bc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.122500] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181251MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 831.122649] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 831.122841] env[60044]: DEBUG oslo_concurrency.lockutils [None 
req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 831.191690] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ce718fc3-6f75-49b9-8543-c953646ce0d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 831.191849] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 426f9016-4e69-4e46-87f6-a67f77da5dff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 831.191977] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 831.192113] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 27836d31-f379-4b4b-aed1-155f4a947779 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 831.192232] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ef011071-c0e1-44e0-9940-285f2f45da67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 831.203442] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance e84f3fe9-d377-4018-8874-972d1f888208 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 831.247820] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 0d87148b-1493-4777-a8b3-b94a64e8eca6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 831.248050] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 831.248199] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 831.372721] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7d37560-d5d4-4962-9da9-9f50056de6d4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.380877] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1004758c-f68f-4140-afdf-ee96e5ebd985 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.414639] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5afdacfe-9b5d-494e-bc44-38b995b3c589 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.422463] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44c70d61-d2ba-4177-8a2e-710ebb7b7f3a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.436753] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 831.446787] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 831.464341] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 831.464903] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.342s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 832.434583] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running 
periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 832.434892] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 833.019548] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 836.102853] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "f3566a4b-8fe0-4c85-9c45-7c67cfd30323" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 836.103151] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "f3566a4b-8fe0-4c85-9c45-7c67cfd30323" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 858.925094] env[60044]: WARNING oslo_vmware.rw_handles [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 858.925094] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 858.925094] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 858.925094] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 858.925094] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 858.925094] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 858.925094] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 858.925094] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 858.925094] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 858.925094] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 858.925094] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 858.925094] env[60044]: ERROR oslo_vmware.rw_handles [ 858.925909] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/8ca147f9-e75d-4bdb-8aed-11c089ea053a/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store 
datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 858.927704] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 858.927988] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Copying Virtual Disk [datastore2] vmware_temp/8ca147f9-e75d-4bdb-8aed-11c089ea053a/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/8ca147f9-e75d-4bdb-8aed-11c089ea053a/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 858.928327] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1b7e75af-1b4e-4e02-8fb2-7fe918be62ac {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.936607] env[60044]: DEBUG oslo_vmware.api [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Waiting for the task: (returnval){ [ 858.936607] env[60044]: value = "task-2204763" [ 858.936607] env[60044]: _type = "Task" [ 858.936607] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 858.945191] env[60044]: DEBUG oslo_vmware.api [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Task: {'id': task-2204763, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 859.446795] env[60044]: DEBUG oslo_vmware.exceptions [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Fault InvalidArgument not matched. 
{{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 859.447014] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 859.447516] env[60044]: ERROR nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 859.447516] env[60044]: Faults: ['InvalidArgument'] [ 859.447516] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Traceback (most recent call last): [ 859.447516] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 859.447516] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] yield resources [ 859.447516] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 859.447516] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] self.driver.spawn(context, instance, image_meta, [ 859.447516] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 859.447516] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 859.447516] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 859.447516] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] self._fetch_image_if_missing(context, vi) [ 859.447516] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] image_cache(vi, tmp_image_ds_loc) [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] vm_util.copy_virtual_disk( [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] session._wait_for_task(vmdk_copy_task) [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] return self.wait_for_task(task_ref) [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] return evt.wait() [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] result = hub.switch() [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 859.448037] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] return self.greenlet.switch() [ 859.448439] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 859.448439] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] self.f(*self.args, **self.kw) [ 859.448439] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 859.448439] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] raise exceptions.translate_fault(task_info.error) [ 859.448439] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 859.448439] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Faults: ['InvalidArgument'] [ 859.448439] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] [ 859.448439] env[60044]: INFO nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Terminating instance [ 859.449386] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 859.449588] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 859.449834] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d728ad62-bd79-4838-8822-4554f28662dd {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.451955] env[60044]: DEBUG nova.compute.manager 
[None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 859.452169] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 859.452879] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13c6930a-41dc-4851-a6c6-8d735b17170b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.459331] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 859.459553] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e7508316-2d18-4c32-80f5-f841913e2836 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.461530] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 859.461702] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 859.462652] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-66d0fdbd-46a9-433e-9fb6-dd9f8b3ef7b7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.467324] env[60044]: DEBUG oslo_vmware.api [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Waiting for the task: (returnval){ [ 859.467324] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52dfedd9-349d-7296-09a2-6e6254ab18bc" [ 859.467324] env[60044]: _type = "Task" [ 859.467324] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 859.474484] env[60044]: DEBUG oslo_vmware.api [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52dfedd9-349d-7296-09a2-6e6254ab18bc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 859.528303] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 859.528513] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 859.528680] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Deleting the datastore file [datastore2] ce718fc3-6f75-49b9-8543-c953646ce0d9 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 859.528933] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0e53c6a0-bfbc-48d4-a44d-51895d755b42 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.534674] env[60044]: DEBUG oslo_vmware.api [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Waiting for the task: (returnval){ [ 859.534674] env[60044]: value = "task-2204765" [ 859.534674] env[60044]: _type = "Task" [ 859.534674] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 859.541804] env[60044]: DEBUG oslo_vmware.api [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Task: {'id': task-2204765, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 859.977440] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 859.977891] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Creating directory with path [datastore2] vmware_temp/3e44d50a-5055-4fa6-93ba-6139c1153cdc/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 859.977891] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2ee15902-eba3-4ffa-a82f-220cc8991dbc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.989093] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Created directory with path [datastore2] vmware_temp/3e44d50a-5055-4fa6-93ba-6139c1153cdc/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 859.989291] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Fetch image to [datastore2] vmware_temp/3e44d50a-5055-4fa6-93ba-6139c1153cdc/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 859.989458] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/3e44d50a-5055-4fa6-93ba-6139c1153cdc/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 859.990180] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2792b95-f6e1-4ab2-8d41-72f30ec82690 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.996437] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab7b900c-692f-4ede-b8fd-53255a90a842 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.004944] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7852150b-dfd5-47b9-a1c0-784579b724ba {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.038357] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3a3f305-fc7b-4c07-aa02-a2cf9ebdacb0 {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.046125] env[60044]: DEBUG oslo_vmware.api [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Task: {'id': task-2204765, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073546} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 860.047501] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 860.047687] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 860.047855] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 860.048032] env[60044]: INFO nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Took 0.60 seconds to destroy the instance on the hypervisor. 
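[editor's note] The failure recorded above follows the poll-and-translate pattern visible in the traceback: the driver issues CopyVirtualDisk_Task, oslo.vmware polls the task until it reports an error state, and the fault is translated into VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']), after which the instance is torn down. The snippet below is a minimal, self-contained Python sketch of that pattern only; the TaskInfo and VimFaultException classes and the fake task states are invented stand-ins for illustration, not oslo.vmware code.

# Illustrative sketch of the poll-and-translate loop seen in the traceback
# (oslo_vmware.api._poll_task raising exceptions.translate_fault(task_info.error)).
# TaskInfo, VimFaultException and the fake task states are hypothetical stand-ins.
import time
from dataclasses import dataclass, field


class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


@dataclass
class TaskInfo:
    state: str                      # 'running', 'success' or 'error'
    error_msg: str = ""
    fault_list: list = field(default_factory=list)


def wait_for_task(poll, interval=0.5, timeout=30):
    """Poll a task until it succeeds; raise a translated fault on error."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = poll()
        if info.state == "success":
            return info
        if info.state == "error":
            # This is where the log's fault surfaces as
            # "A specified parameter was not correct: fileType".
            raise VimFaultException(info.fault_list, info.error_msg)
        time.sleep(interval)
    raise TimeoutError("task did not complete in time")


if __name__ == "__main__":
    states = iter([
        TaskInfo("running"),
        TaskInfo("error",
                 "A specified parameter was not correct: fileType",
                 ["InvalidArgument"]),
    ])
    try:
        wait_for_task(lambda: next(states), interval=0)
    except VimFaultException as exc:
        print("Faults:", exc.fault_list, "-", exc)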
[ 860.049743] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-89f2b4f6-58ca-4a74-988e-8695428be27a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.051565] env[60044]: DEBUG nova.compute.claims [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 860.051737] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 860.051982] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 860.072963] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 860.119759] env[60044]: DEBUG oslo_vmware.rw_handles [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3e44d50a-5055-4fa6-93ba-6139c1153cdc/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 860.178893] env[60044]: DEBUG oslo_vmware.rw_handles [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 860.179086] env[60044]: DEBUG oslo_vmware.rw_handles [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3e44d50a-5055-4fa6-93ba-6139c1153cdc/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 860.237906] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33b3c589-8413-4f1f-a015-3e27bf70c133 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.245524] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e3cefca-9c47-4e29-b26e-bbee1bafb887 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.276022] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f696d93-7bee-475f-93fc-1fa6cf811d68 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.282774] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf4a089f-2e34-42c4-901f-9ea8df5656af {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.295398] env[60044]: DEBUG nova.compute.provider_tree [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 860.303684] env[60044]: DEBUG nova.scheduler.client.report [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 860.319587] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.268s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 860.320426] env[60044]: ERROR nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 860.320426] env[60044]: Faults: ['InvalidArgument'] [ 860.320426] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Traceback (most recent call last): [ 860.320426] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 860.320426] env[60044]: ERROR nova.compute.manager [instance: 
ce718fc3-6f75-49b9-8543-c953646ce0d9] self.driver.spawn(context, instance, image_meta, [ 860.320426] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 860.320426] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 860.320426] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 860.320426] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] self._fetch_image_if_missing(context, vi) [ 860.320426] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 860.320426] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] image_cache(vi, tmp_image_ds_loc) [ 860.320426] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] vm_util.copy_virtual_disk( [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] session._wait_for_task(vmdk_copy_task) [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] return self.wait_for_task(task_ref) [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] return evt.wait() [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] result = hub.switch() [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] return self.greenlet.switch() [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 860.320817] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] self.f(*self.args, **self.kw) [ 860.321205] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 860.321205] env[60044]: ERROR 
nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] raise exceptions.translate_fault(task_info.error) [ 860.321205] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 860.321205] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Faults: ['InvalidArgument'] [ 860.321205] env[60044]: ERROR nova.compute.manager [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] [ 860.321205] env[60044]: DEBUG nova.compute.utils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] VimFaultException {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 860.322230] env[60044]: DEBUG nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Build of instance ce718fc3-6f75-49b9-8543-c953646ce0d9 was re-scheduled: A specified parameter was not correct: fileType [ 860.322230] env[60044]: Faults: ['InvalidArgument'] {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 860.322596] env[60044]: DEBUG nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 860.322762] env[60044]: DEBUG nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 860.322927] env[60044]: DEBUG nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 860.323113] env[60044]: DEBUG nova.network.neutron [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 860.606818] env[60044]: DEBUG nova.network.neutron [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 860.617500] env[60044]: INFO nova.compute.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Took 0.29 seconds to deallocate network for instance. [ 860.707148] env[60044]: INFO nova.scheduler.client.report [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Deleted allocations for instance ce718fc3-6f75-49b9-8543-c953646ce0d9 [ 860.723797] env[60044]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "ce718fc3-6f75-49b9-8543-c953646ce0d9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 337.975s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 860.724295] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "ce718fc3-6f75-49b9-8543-c953646ce0d9" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 140.053s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 860.724521] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "ce718fc3-6f75-49b9-8543-c953646ce0d9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 860.724720] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "ce718fc3-6f75-49b9-8543-c953646ce0d9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 860.724880] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "ce718fc3-6f75-49b9-8543-c953646ce0d9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 860.726938] env[60044]: INFO nova.compute.manager [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Terminating instance [ 860.728582] env[60044]: DEBUG nova.compute.manager [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 860.728766] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 860.729262] env[60044]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f56edb85-a034-40cd-a4e2-4fbd64e92f2c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.737980] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55064537-329d-45aa-9873-b85bcc0f4ac5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.748940] env[60044]: DEBUG nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 860.770242] env[60044]: WARNING nova.virt.vmwareapi.vmops [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ce718fc3-6f75-49b9-8543-c953646ce0d9 could not be found. [ 860.770424] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 860.772011] env[60044]: INFO nova.compute.manager [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Took 0.04 seconds to destroy the instance on the hypervisor. 
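[editor's note] The lock messages above show build and terminate serialized on the instance UUID ce718fc3-6f75-49b9-8543-c953646ce0d9: the build path held the lock for 337.975s while do_terminate_instance waited 140.053s to acquire it, and lockutils reports both the wait and hold durations. Below is a stdlib-only sketch that mimics that waited/held accounting; the instance_lock helper and its output format are invented for illustration, whereas the real code path uses oslo_concurrency.lockutils keyed on the instance UUID.

# Stdlib-only illustration of the "waited N s / held M s" lock accounting
# printed in the records above. instance_lock and its messages are made up
# for this sketch; they are not lockutils.
import contextlib
import threading
import time

_locks = {}
_registry_guard = threading.Lock()


@contextlib.contextmanager
def instance_lock(name, caller):
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - t0
    print(f'Lock "{name}" acquired by "{caller}" :: waited {waited:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        held = time.monotonic() - t1
        print(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')


if __name__ == "__main__":
    uuid = "ce718fc3-6f75-49b9-8543-c953646ce0d9"

    def build():
        with instance_lock(uuid, "_locked_do_build_and_run_instance"):
            time.sleep(0.2)      # stands in for the long build attempt

    def terminate():
        with instance_lock(uuid, "do_terminate_instance"):
            time.sleep(0.05)     # runs only once the build lock is released

    b = threading.Thread(target=build)
    t = threading.Thread(target=terminate)
    b.start()
    time.sleep(0.01)
    t.start()
    b.join()
    t.join()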
[ 860.772011] env[60044]: DEBUG oslo.service.loopingcall [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 860.772011] env[60044]: DEBUG nova.compute.manager [-] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 860.772011] env[60044]: DEBUG nova.network.neutron [-] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 860.792832] env[60044]: DEBUG nova.network.neutron [-] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 860.794410] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 860.794631] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 860.796404] env[60044]: INFO nova.compute.claims [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 860.802845] env[60044]: INFO nova.compute.manager [-] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Took 0.03 seconds to deallocate network for instance. 
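[editor's note] The claim just logged and the resource view audited earlier in this section are plain arithmetic over the per-instance allocations {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}. The small check below, assuming the reserved values shown in the inventory (512 MB RAM, 0 GB disk reserved), reproduces the reported figures; it is a sketch of the bookkeeping, not nova.compute.resource_tracker code.

# Arithmetic check of the resource-tracker figures in this log: five instances,
# each claiming 1 GB disk / 128 MB RAM / 1 vCPU, plus the reserved RAM from the
# provider inventory. Not Nova code, just the numbers.
allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 5

reserved_ram_mb = 512          # MEMORY_MB 'reserved' in the inventory data
used_ram_mb = reserved_ram_mb + sum(a["MEMORY_MB"] for a in allocations)
used_disk_gb = sum(a["DISK_GB"] for a in allocations)
used_vcpus = sum(a["VCPU"] for a in allocations)

assert used_ram_mb == 1152     # matches "used_ram=1152MB" in the final resource view
assert used_disk_gb == 5       # matches "used_disk=5GB"
assert used_vcpus == 5         # matches "used_vcpus=5" / "total allocated vcpus: 5"

# Effective VCPU capacity with allocation_ratio 4.0 and nothing reserved:
vcpu_capacity = int((48 - 0) * 4.0)
print(used_ram_mb, used_disk_gb, used_vcpus, vcpu_capacity)   # 1152 5 5 192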
[ 860.884077] env[60044]: DEBUG oslo_concurrency.lockutils [None req-9df83d28-f286-4e1d-bd7a-eccbf738645e tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "ce718fc3-6f75-49b9-8543-c953646ce0d9" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.160s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 860.936773] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fac91699-16e8-487d-bfb8-1c889acc50f4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.944534] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2602ddf-906d-43c9-8bd7-ec854fafde2a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.974047] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15dbb2e9-91d0-499e-bf37-8206daf230ad {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.980648] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43a7a6d3-ac4c-4e0e-a683-f502d0a8b930 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.993133] env[60044]: DEBUG nova.compute.provider_tree [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 861.001798] env[60044]: DEBUG nova.scheduler.client.report [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 861.014055] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 861.014555] env[60044]: DEBUG nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Start building networks asynchronously for instance. 
{{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 861.044626] env[60044]: DEBUG nova.compute.utils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 861.046115] env[60044]: DEBUG nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 861.046601] env[60044]: DEBUG nova.network.neutron [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 861.055724] env[60044]: DEBUG nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 861.112366] env[60044]: DEBUG nova.policy [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9b20a4b99c3041d986483e1c4d1cbe79', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a07d0346e8884cf394bb87ea702ec039', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 861.115412] env[60044]: DEBUG nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 861.136227] env[60044]: DEBUG nova.virt.hardware [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 861.136227] env[60044]: DEBUG nova.virt.hardware [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 861.136423] env[60044]: DEBUG nova.virt.hardware [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 861.136892] env[60044]: DEBUG nova.virt.hardware [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 861.136892] env[60044]: DEBUG nova.virt.hardware [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 861.136892] env[60044]: DEBUG nova.virt.hardware [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 861.137125] env[60044]: DEBUG nova.virt.hardware [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 861.137474] env[60044]: DEBUG nova.virt.hardware [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
861.137474] env[60044]: DEBUG nova.virt.hardware [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 861.137655] env[60044]: DEBUG nova.virt.hardware [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 861.137655] env[60044]: DEBUG nova.virt.hardware [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 861.138613] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-111c06f1-3180-4d4d-973c-2c9d260759b7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 861.146126] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-235d3b63-30d7-48d1-b038-f3afc36355b9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 861.384267] env[60044]: DEBUG nova.network.neutron [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Successfully created port: 31775527-1500-4e26-a5d5-cbcf0b43734d {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 862.023795] env[60044]: DEBUG nova.network.neutron [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Successfully updated port: 31775527-1500-4e26-a5d5-cbcf0b43734d {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 862.035738] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "refresh_cache-e84f3fe9-d377-4018-8874-972d1f888208" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 862.035738] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquired lock "refresh_cache-e84f3fe9-d377-4018-8874-972d1f888208" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 862.035738] env[60044]: DEBUG nova.network.neutron [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 862.065804] env[60044]: DEBUG nova.network.neutron [None 
req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 862.384081] env[60044]: DEBUG nova.network.neutron [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Updating instance_info_cache with network_info: [{"id": "31775527-1500-4e26-a5d5-cbcf0b43734d", "address": "fa:16:3e:9a:94:85", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.130", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31775527-15", "ovs_interfaceid": "31775527-1500-4e26-a5d5-cbcf0b43734d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 862.396783] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Releasing lock "refresh_cache-e84f3fe9-d377-4018-8874-972d1f888208" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 862.397094] env[60044]: DEBUG nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Instance network_info: |[{"id": "31775527-1500-4e26-a5d5-cbcf0b43734d", "address": "fa:16:3e:9a:94:85", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.130", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31775527-15", "ovs_interfaceid": "31775527-1500-4e26-a5d5-cbcf0b43734d", "qbh_params": null, "qbg_params": null, "active": true, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 862.397433] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9a:94:85', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '31775527-1500-4e26-a5d5-cbcf0b43734d', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 862.404857] env[60044]: DEBUG oslo.service.loopingcall [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 862.405322] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 862.405541] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d031a676-13fc-4660-8487-aff447741d8d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 862.425245] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 862.425245] env[60044]: value = "task-2204766" [ 862.425245] env[60044]: _type = "Task" [ 862.425245] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 862.432895] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204766, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 862.682286] env[60044]: DEBUG nova.compute.manager [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Received event network-vif-plugged-31775527-1500-4e26-a5d5-cbcf0b43734d {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 862.682286] env[60044]: DEBUG oslo_concurrency.lockutils [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] Acquiring lock "e84f3fe9-d377-4018-8874-972d1f888208-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 862.682613] env[60044]: DEBUG oslo_concurrency.lockutils [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] Lock "e84f3fe9-d377-4018-8874-972d1f888208-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 862.683197] env[60044]: DEBUG oslo_concurrency.lockutils [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] Lock "e84f3fe9-d377-4018-8874-972d1f888208-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 862.683754] env[60044]: DEBUG nova.compute.manager [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] [instance: e84f3fe9-d377-4018-8874-972d1f888208] No waiting events found dispatching network-vif-plugged-31775527-1500-4e26-a5d5-cbcf0b43734d {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 862.684180] env[60044]: WARNING nova.compute.manager [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Received unexpected event network-vif-plugged-31775527-1500-4e26-a5d5-cbcf0b43734d for instance with vm_state building and task_state spawning. [ 862.684614] env[60044]: DEBUG nova.compute.manager [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Received event network-changed-31775527-1500-4e26-a5d5-cbcf0b43734d {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 862.685044] env[60044]: DEBUG nova.compute.manager [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Refreshing instance network info cache due to event network-changed-31775527-1500-4e26-a5d5-cbcf0b43734d. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 862.685493] env[60044]: DEBUG oslo_concurrency.lockutils [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] Acquiring lock "refresh_cache-e84f3fe9-d377-4018-8874-972d1f888208" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 862.685877] env[60044]: DEBUG oslo_concurrency.lockutils [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] Acquired lock "refresh_cache-e84f3fe9-d377-4018-8874-972d1f888208" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 862.686324] env[60044]: DEBUG nova.network.neutron [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Refreshing network info cache for port 31775527-1500-4e26-a5d5-cbcf0b43734d {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 862.926871] env[60044]: DEBUG nova.network.neutron [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Updated VIF entry in instance network info cache for port 31775527-1500-4e26-a5d5-cbcf0b43734d. {{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 862.927256] env[60044]: DEBUG nova.network.neutron [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Updating instance_info_cache with network_info: [{"id": "31775527-1500-4e26-a5d5-cbcf0b43734d", "address": "fa:16:3e:9a:94:85", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.130", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31775527-15", "ovs_interfaceid": "31775527-1500-4e26-a5d5-cbcf0b43734d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 862.938030] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204766, 'name': CreateVM_Task, 'duration_secs': 0.292415} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 862.938999] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 862.939452] env[60044]: DEBUG oslo_concurrency.lockutils [req-5a727ab6-0f91-4faf-b7de-21fdc7f9cfc8 req-91038e6c-e4c7-4053-bbd0-a22ceb904755 service nova] Releasing lock "refresh_cache-e84f3fe9-d377-4018-8874-972d1f888208" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 862.940183] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 862.940338] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 862.940659] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 862.941109] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a4a763ec-d990-4f15-8496-241cb3ebd9a9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 862.946151] env[60044]: DEBUG oslo_vmware.api [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Waiting for the task: (returnval){ [ 862.946151] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52af0350-bb71-e21f-79dc-400710419242" [ 862.946151] env[60044]: _type = "Task" [ 862.946151] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 862.954215] env[60044]: DEBUG oslo_vmware.api [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52af0350-bb71-e21f-79dc-400710419242, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 863.297281] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "c0f7ff03-5203-418d-aa9e-420448e9dbfb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.297575] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "c0f7ff03-5203-418d-aa9e-420448e9dbfb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.456566] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 863.456765] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 863.456966] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 888.019969] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 889.019108] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 890.018892] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 891.019769] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 891.029707] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 891.029936] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 891.030121] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 891.030281] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 891.031391] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01c40ff2-6e64-41a6-b4cd-cd6172e6d8f8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 891.040482] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ae06ebc-9b8a-4cc1-934c-2b18adc88fdf {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 891.054499] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc6eb157-ee0a-4eb9-81fd-f57455c00887 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 891.060915] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e17099c0-9c49-4a55-9ee6-221ad51b5ee5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 891.089415] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181198MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 891.089620] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 891.089850] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 891.139627] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 426f9016-4e69-4e46-87f6-a67f77da5dff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 
'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 891.139778] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 891.139903] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 27836d31-f379-4b4b-aed1-155f4a947779 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 891.140036] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ef011071-c0e1-44e0-9940-285f2f45da67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 891.140158] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance e84f3fe9-d377-4018-8874-972d1f888208 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 891.151417] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 0d87148b-1493-4777-a8b3-b94a64e8eca6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 891.161477] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance f3566a4b-8fe0-4c85-9c45-7c67cfd30323 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 891.170945] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance c0f7ff03-5203-418d-aa9e-420448e9dbfb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 891.171171] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 891.171318] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 891.269093] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e55034fc-c9d9-4eaa-be87-04c2046439a4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 891.276489] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17088c31-0ec5-4e20-abdd-af7b1a3ef248 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 891.306584] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a8c8c2d-d444-4da5-8b5b-e1fb7bab234c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 891.313204] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deb0c029-34be-420c-8d13-35f157135b90 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 891.325878] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 891.333865] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 891.346442] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 891.346636] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.257s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 892.340545] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running 
periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 892.340841] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 892.340941] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 892.341098] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 893.019844] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 893.020073] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 893.020163] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 893.034242] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 893.034396] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 893.034568] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 893.034665] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 893.034786] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 893.034904] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. 
{{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 893.035332] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 907.329679] env[60044]: WARNING oslo_vmware.rw_handles [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 907.329679] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 907.329679] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 907.329679] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 907.329679] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 907.329679] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 907.329679] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 907.329679] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 907.329679] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 907.329679] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 907.329679] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 907.329679] env[60044]: ERROR oslo_vmware.rw_handles [ 907.330453] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/3e44d50a-5055-4fa6-93ba-6139c1153cdc/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 907.332039] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 907.332346] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Copying Virtual Disk [datastore2] vmware_temp/3e44d50a-5055-4fa6-93ba-6139c1153cdc/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/3e44d50a-5055-4fa6-93ba-6139c1153cdc/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 907.332697] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-05c29818-7dd0-4a0d-9bef-2c42d8ab4894 {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.340438] env[60044]: DEBUG oslo_vmware.api [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Waiting for the task: (returnval){ [ 907.340438] env[60044]: value = "task-2204767" [ 907.340438] env[60044]: _type = "Task" [ 907.340438] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 907.349253] env[60044]: DEBUG oslo_vmware.api [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Task: {'id': task-2204767, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 907.851273] env[60044]: DEBUG oslo_vmware.exceptions [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Fault InvalidArgument not matched. {{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 907.851514] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 907.852087] env[60044]: ERROR nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 907.852087] env[60044]: Faults: ['InvalidArgument'] [ 907.852087] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Traceback (most recent call last): [ 907.852087] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 907.852087] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] yield resources [ 907.852087] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 907.852087] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] self.driver.spawn(context, instance, image_meta, [ 907.852087] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 907.852087] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 907.852087] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 907.852087] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] self._fetch_image_if_missing(context, vi) [ 907.852087] env[60044]: ERROR nova.compute.manager [instance: 
ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] image_cache(vi, tmp_image_ds_loc) [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] vm_util.copy_virtual_disk( [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] session._wait_for_task(vmdk_copy_task) [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] return self.wait_for_task(task_ref) [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] return evt.wait() [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] result = hub.switch() [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 907.852717] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] return self.greenlet.switch() [ 907.853271] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 907.853271] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] self.f(*self.args, **self.kw) [ 907.853271] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 907.853271] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] raise exceptions.translate_fault(task_info.error) [ 907.853271] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 907.853271] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Faults: ['InvalidArgument'] [ 907.853271] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] [ 907.853271] env[60044]: INFO nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Terminating instance [ 
907.854568] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 907.854568] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 907.854568] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ee26edac-fd2e-4002-b4ef-65f629a2cd70 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.856413] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "refresh_cache-ae25fbd0-3770-43fc-9850-cdb2065b5ce3" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 907.856570] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquired lock "refresh_cache-ae25fbd0-3770-43fc-9850-cdb2065b5ce3" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 907.856760] env[60044]: DEBUG nova.network.neutron [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 907.863888] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 907.864102] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 907.865385] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3fe9f91c-2e2b-4103-830d-782d298c6129 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.872704] env[60044]: DEBUG oslo_vmware.api [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Waiting for the task: (returnval){ [ 907.872704] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52b68385-cbb0-3d42-292f-a3089741c8c6" [ 907.872704] env[60044]: _type = "Task" [ 907.872704] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 907.881016] env[60044]: DEBUG oslo_vmware.api [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52b68385-cbb0-3d42-292f-a3089741c8c6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 907.888459] env[60044]: DEBUG nova.network.neutron [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 907.945716] env[60044]: DEBUG nova.network.neutron [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 907.955177] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Releasing lock "refresh_cache-ae25fbd0-3770-43fc-9850-cdb2065b5ce3" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 907.955631] env[60044]: DEBUG nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 907.955887] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 907.956970] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f161c508-29b5-4340-b3f7-ebf7ae902e3c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.964916] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 907.965146] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e386c752-6e3f-4a82-8b07-ae45b5fb2bf9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.001907] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 908.002163] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 908.002334] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Deleting the datastore file [datastore2] ae25fbd0-3770-43fc-9850-cdb2065b5ce3 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 908.002589] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8c07579e-9717-41da-8089-4009749ff55d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.008653] env[60044]: DEBUG oslo_vmware.api [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Waiting for the task: (returnval){ [ 908.008653] env[60044]: value = "task-2204769" [ 908.008653] env[60044]: _type = "Task" [ 908.008653] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 908.016130] env[60044]: DEBUG oslo_vmware.api [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Task: {'id': task-2204769, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 908.383375] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 908.383795] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Creating directory with path [datastore2] vmware_temp/ee5ea26b-c0ec-4d8d-89fd-7a3039134f2d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 908.383929] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-667924d4-ce9b-4114-bee7-71e453ba5c55 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.394499] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Created directory with path [datastore2] vmware_temp/ee5ea26b-c0ec-4d8d-89fd-7a3039134f2d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 908.394683] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Fetch image to [datastore2] vmware_temp/ee5ea26b-c0ec-4d8d-89fd-7a3039134f2d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 908.394844] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/ee5ea26b-c0ec-4d8d-89fd-7a3039134f2d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 908.395557] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a05c6b9-d960-4814-84b7-93ed6ba2234b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.401987] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15b51cfc-5b95-4ed0-b618-b8df25886fca {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.410495] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e2de14a-eaff-4b73-a92c-80049e7dca20 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.439501] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94b320f6-d79e-4b99-a5a5-b01d25a94293 {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.444496] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-77caed9d-f911-4bca-aec0-bdc35982927a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.462317] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 908.506515] env[60044]: DEBUG oslo_vmware.rw_handles [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ee5ea26b-c0ec-4d8d-89fd-7a3039134f2d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 908.563918] env[60044]: DEBUG oslo_vmware.rw_handles [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 908.564120] env[60044]: DEBUG oslo_vmware.rw_handles [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ee5ea26b-c0ec-4d8d-89fd-7a3039134f2d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 908.567569] env[60044]: DEBUG oslo_vmware.api [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Task: {'id': task-2204769, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.030975} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 908.567794] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 908.567969] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 908.568155] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 908.568325] env[60044]: INFO nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Took 0.61 seconds to destroy the instance on the hypervisor. [ 908.568612] env[60044]: DEBUG oslo.service.loopingcall [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 908.568829] env[60044]: DEBUG nova.compute.manager [-] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 908.570994] env[60044]: DEBUG nova.compute.claims [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 908.571178] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.571380] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.709503] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2b81838-72a9-4715-aaac-436df5a68a56 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.717237] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd092652-e130-4b77-bf0d-7700a392d305 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.746661] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed7687cd-dcc3-4921-a89f-acb9b6dbbfdd {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.754229] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cabae39b-2d17-4274-99f1-5cb587ade79d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.767098] env[60044]: DEBUG nova.compute.provider_tree [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 908.775643] env[60044]: DEBUG nova.scheduler.client.report [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 908.791148] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 
tempest-ServersAdmin275Test-1044309481-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.220s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.791691] env[60044]: ERROR nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 908.791691] env[60044]: Faults: ['InvalidArgument'] [ 908.791691] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Traceback (most recent call last): [ 908.791691] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 908.791691] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] self.driver.spawn(context, instance, image_meta, [ 908.791691] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 908.791691] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 908.791691] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 908.791691] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] self._fetch_image_if_missing(context, vi) [ 908.791691] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 908.791691] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] image_cache(vi, tmp_image_ds_loc) [ 908.791691] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] vm_util.copy_virtual_disk( [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] session._wait_for_task(vmdk_copy_task) [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] return self.wait_for_task(task_ref) [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] return evt.wait() [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] result = hub.switch() [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] return self.greenlet.switch() [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 908.792159] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] self.f(*self.args, **self.kw) [ 908.792584] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 908.792584] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] raise exceptions.translate_fault(task_info.error) [ 908.792584] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 908.792584] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Faults: ['InvalidArgument'] [ 908.792584] env[60044]: ERROR nova.compute.manager [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] [ 908.792584] env[60044]: DEBUG nova.compute.utils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] VimFaultException {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 908.793885] env[60044]: DEBUG nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Build of instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3 was re-scheduled: A specified parameter was not correct: fileType [ 908.793885] env[60044]: Faults: ['InvalidArgument'] {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 908.794302] env[60044]: DEBUG nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 908.794527] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "refresh_cache-ae25fbd0-3770-43fc-9850-cdb2065b5ce3" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 908.794670] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquired lock "refresh_cache-ae25fbd0-3770-43fc-9850-cdb2065b5ce3" {{(pid=60044) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 908.794828] env[60044]: DEBUG nova.network.neutron [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 908.818520] env[60044]: DEBUG nova.network.neutron [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 908.880194] env[60044]: DEBUG nova.network.neutron [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 908.890932] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Releasing lock "refresh_cache-ae25fbd0-3770-43fc-9850-cdb2065b5ce3" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 908.891187] env[60044]: DEBUG nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 908.891368] env[60044]: DEBUG nova.compute.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 908.977571] env[60044]: INFO nova.scheduler.client.report [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Deleted allocations for instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3 [ 908.998417] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "ae25fbd0-3770-43fc-9850-cdb2065b5ce3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 376.674s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.000009] env[60044]: DEBUG oslo_concurrency.lockutils [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "ae25fbd0-3770-43fc-9850-cdb2065b5ce3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 175.931s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 909.000318] env[60044]: DEBUG oslo_concurrency.lockutils [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "ae25fbd0-3770-43fc-9850-cdb2065b5ce3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 909.000608] env[60044]: DEBUG oslo_concurrency.lockutils [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "ae25fbd0-3770-43fc-9850-cdb2065b5ce3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 909.000844] env[60044]: DEBUG oslo_concurrency.lockutils [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "ae25fbd0-3770-43fc-9850-cdb2065b5ce3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.003633] env[60044]: INFO nova.compute.manager [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Terminating instance [ 909.005902] env[60044]: DEBUG oslo_concurrency.lockutils [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "refresh_cache-ae25fbd0-3770-43fc-9850-cdb2065b5ce3" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 909.006172] env[60044]: DEBUG oslo_concurrency.lockutils [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquired lock "refresh_cache-ae25fbd0-3770-43fc-9850-cdb2065b5ce3" {{(pid=60044) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 909.006445] env[60044]: DEBUG nova.network.neutron [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 909.017798] env[60044]: DEBUG nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 909.036930] env[60044]: DEBUG nova.network.neutron [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 909.070831] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 909.071250] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 909.073496] env[60044]: INFO nova.compute.claims [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 909.104578] env[60044]: DEBUG nova.network.neutron [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 909.114222] env[60044]: DEBUG oslo_concurrency.lockutils [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Releasing lock "refresh_cache-ae25fbd0-3770-43fc-9850-cdb2065b5ce3" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 909.114666] env[60044]: DEBUG nova.compute.manager [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 909.114853] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 909.115357] env[60044]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-90261e81-495f-4f34-a2b8-e07145b9fdb8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.127585] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7f8eaaf-726c-4bd8-bd5f-5ec38072628f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.160687] env[60044]: WARNING nova.virt.vmwareapi.vmops [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3 could not be found. [ 909.160912] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 909.161072] env[60044]: INFO nova.compute.manager [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Took 0.05 seconds to destroy the instance on the hypervisor. [ 909.161314] env[60044]: DEBUG oslo.service.loopingcall [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 909.164143] env[60044]: DEBUG nova.compute.manager [-] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 909.164308] env[60044]: DEBUG nova.network.neutron [-] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 909.182625] env[60044]: DEBUG nova.network.neutron [-] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 909.190152] env[60044]: DEBUG nova.network.neutron [-] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 909.198632] env[60044]: INFO nova.compute.manager [-] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Took 0.03 seconds to deallocate network for instance. 
[ 909.242231] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a85f5b5-75c7-4666-b952-2c35eae79f43 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.253627] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7daa0e40-43cc-4ca0-a7ab-05a84a01750d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.289330] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6975588-86a5-4b9a-9bb7-142d405d096f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.296772] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-332b8232-5fc4-45ae-b106-25211a26dcfe {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.310829] env[60044]: DEBUG nova.compute.provider_tree [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 909.319104] env[60044]: DEBUG nova.scheduler.client.report [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 909.344181] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.344772] env[60044]: DEBUG nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Start building networks asynchronously for instance. 
{{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 909.357594] env[60044]: DEBUG oslo_concurrency.lockutils [None req-4adae7d0-8953-44ee-a2d7-808576aac9f2 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "ae25fbd0-3770-43fc-9850-cdb2065b5ce3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.358s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.383691] env[60044]: DEBUG nova.compute.utils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 909.384988] env[60044]: DEBUG nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 909.385195] env[60044]: DEBUG nova.network.neutron [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 909.394400] env[60044]: DEBUG nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 909.467379] env[60044]: DEBUG nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 909.470944] env[60044]: DEBUG nova.policy [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '588d0c5d584544c3be2d880de2c00a37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7913858bdbbe4375917c0e1864ee8d2e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 909.489677] env[60044]: DEBUG nova.virt.hardware [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:05:26Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='bf42d29d-3b09-4eb4-947f-ff6ae6b39228',id=38,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-605856663',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 909.489924] env[60044]: DEBUG nova.virt.hardware [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 909.490113] env[60044]: DEBUG nova.virt.hardware [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 909.490259] env[60044]: DEBUG nova.virt.hardware [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 909.490399] env[60044]: DEBUG nova.virt.hardware [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 909.490542] env[60044]: DEBUG nova.virt.hardware [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 909.490746] env[60044]: 
DEBUG nova.virt.hardware [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 909.490902] env[60044]: DEBUG nova.virt.hardware [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 909.492186] env[60044]: DEBUG nova.virt.hardware [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 909.492768] env[60044]: DEBUG nova.virt.hardware [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 909.492768] env[60044]: DEBUG nova.virt.hardware [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 909.493713] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41fe2755-4a72-46c5-8a07-34eeb1e60137 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.505084] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd260d00-c08c-4b86-be46-3873bfff0c27 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.775194] env[60044]: DEBUG nova.network.neutron [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Successfully created port: cded2776-0714-4c2f-8cfb-b0045f17aa01 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 910.278546] env[60044]: DEBUG nova.compute.manager [req-5f37dffa-a556-4814-8415-d25d79bff32f req-9c3921a7-2646-486e-9b86-f52e12ae2813 service nova] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Received event network-vif-plugged-cded2776-0714-4c2f-8cfb-b0045f17aa01 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 910.278761] env[60044]: DEBUG oslo_concurrency.lockutils [req-5f37dffa-a556-4814-8415-d25d79bff32f req-9c3921a7-2646-486e-9b86-f52e12ae2813 service nova] Acquiring lock "0d87148b-1493-4777-a8b3-b94a64e8eca6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 910.278961] env[60044]: DEBUG oslo_concurrency.lockutils [req-5f37dffa-a556-4814-8415-d25d79bff32f req-9c3921a7-2646-486e-9b86-f52e12ae2813 service nova] Lock 
"0d87148b-1493-4777-a8b3-b94a64e8eca6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 910.279137] env[60044]: DEBUG oslo_concurrency.lockutils [req-5f37dffa-a556-4814-8415-d25d79bff32f req-9c3921a7-2646-486e-9b86-f52e12ae2813 service nova] Lock "0d87148b-1493-4777-a8b3-b94a64e8eca6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 910.279299] env[60044]: DEBUG nova.compute.manager [req-5f37dffa-a556-4814-8415-d25d79bff32f req-9c3921a7-2646-486e-9b86-f52e12ae2813 service nova] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] No waiting events found dispatching network-vif-plugged-cded2776-0714-4c2f-8cfb-b0045f17aa01 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 910.279497] env[60044]: WARNING nova.compute.manager [req-5f37dffa-a556-4814-8415-d25d79bff32f req-9c3921a7-2646-486e-9b86-f52e12ae2813 service nova] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Received unexpected event network-vif-plugged-cded2776-0714-4c2f-8cfb-b0045f17aa01 for instance with vm_state building and task_state spawning. [ 910.358129] env[60044]: DEBUG nova.network.neutron [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Successfully updated port: cded2776-0714-4c2f-8cfb-b0045f17aa01 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 910.371776] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "refresh_cache-0d87148b-1493-4777-a8b3-b94a64e8eca6" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 910.371838] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquired lock "refresh_cache-0d87148b-1493-4777-a8b3-b94a64e8eca6" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 910.371975] env[60044]: DEBUG nova.network.neutron [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 910.411702] env[60044]: DEBUG nova.network.neutron [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Instance cache missing network info. 
{{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 910.568993] env[60044]: DEBUG nova.network.neutron [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Updating instance_info_cache with network_info: [{"id": "cded2776-0714-4c2f-8cfb-b0045f17aa01", "address": "fa:16:3e:fd:78:f7", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcded2776-07", "ovs_interfaceid": "cded2776-0714-4c2f-8cfb-b0045f17aa01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 910.582623] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Releasing lock "refresh_cache-0d87148b-1493-4777-a8b3-b94a64e8eca6" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 910.582938] env[60044]: DEBUG nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Instance network_info: |[{"id": "cded2776-0714-4c2f-8cfb-b0045f17aa01", "address": "fa:16:3e:fd:78:f7", "network": {"id": "89333424-f877-469e-8334-886640813d5b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "6e30b729ea7246768c33961f1716d5e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcded2776-07", "ovs_interfaceid": "cded2776-0714-4c2f-8cfb-b0045f17aa01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 910.583324] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None 
req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fd:78:f7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cded2776-0714-4c2f-8cfb-b0045f17aa01', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 910.591069] env[60044]: DEBUG oslo.service.loopingcall [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 910.591538] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 910.591762] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e5a45fec-9e24-42a4-aea4-866471215a89 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 910.612228] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 910.612228] env[60044]: value = "task-2204770" [ 910.612228] env[60044]: _type = "Task" [ 910.612228] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 910.621291] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204770, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 911.122375] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204770, 'name': CreateVM_Task, 'duration_secs': 0.30528} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 911.122548] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 911.123233] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 911.123395] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 911.123695] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 911.123952] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-222a3910-aeda-40af-a5f5-f7c889fa398f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 911.128458] env[60044]: DEBUG oslo_vmware.api [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Waiting for the task: (returnval){ [ 911.128458] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52fbfe69-4ad2-6fc3-680a-985a818a67fe" [ 911.128458] env[60044]: _type = "Task" [ 911.128458] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 911.136261] env[60044]: DEBUG oslo_vmware.api [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52fbfe69-4ad2-6fc3-680a-985a818a67fe, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 911.639795] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 911.640110] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 911.640361] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 912.473229] env[60044]: DEBUG nova.compute.manager [req-e5e3bc36-1808-4dbf-86dd-8bbb63625973 req-94769e2c-447e-45a2-bdc3-ed3e01e35406 service nova] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Received event network-vif-deleted-31775527-1500-4e26-a5d5-cbcf0b43734d {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 912.492406] env[60044]: DEBUG nova.compute.manager [req-c15dd97b-1e65-4c6d-a7f0-ecd91ef8cbd5 req-2d693ed7-d335-4ae7-8bf2-972912842c70 service nova] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Received event network-changed-cded2776-0714-4c2f-8cfb-b0045f17aa01 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 912.492406] env[60044]: DEBUG nova.compute.manager [req-c15dd97b-1e65-4c6d-a7f0-ecd91ef8cbd5 req-2d693ed7-d335-4ae7-8bf2-972912842c70 service nova] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Refreshing instance network info cache due to event network-changed-cded2776-0714-4c2f-8cfb-b0045f17aa01. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 912.492406] env[60044]: DEBUG oslo_concurrency.lockutils [req-c15dd97b-1e65-4c6d-a7f0-ecd91ef8cbd5 req-2d693ed7-d335-4ae7-8bf2-972912842c70 service nova] Acquiring lock "refresh_cache-0d87148b-1493-4777-a8b3-b94a64e8eca6" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 912.492406] env[60044]: DEBUG oslo_concurrency.lockutils [req-c15dd97b-1e65-4c6d-a7f0-ecd91ef8cbd5 req-2d693ed7-d335-4ae7-8bf2-972912842c70 service nova] Acquired lock "refresh_cache-0d87148b-1493-4777-a8b3-b94a64e8eca6" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 912.492406] env[60044]: DEBUG nova.network.neutron [req-c15dd97b-1e65-4c6d-a7f0-ecd91ef8cbd5 req-2d693ed7-d335-4ae7-8bf2-972912842c70 service nova] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Refreshing network info cache for port cded2776-0714-4c2f-8cfb-b0045f17aa01 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 912.525541] env[60044]: DEBUG nova.network.neutron [req-c15dd97b-1e65-4c6d-a7f0-ecd91ef8cbd5 req-2d693ed7-d335-4ae7-8bf2-972912842c70 service nova] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 912.622318] env[60044]: DEBUG nova.network.neutron [req-c15dd97b-1e65-4c6d-a7f0-ecd91ef8cbd5 req-2d693ed7-d335-4ae7-8bf2-972912842c70 service nova] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Instance is deleted, no further info cache update {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:106}} [ 912.622318] env[60044]: DEBUG oslo_concurrency.lockutils [req-c15dd97b-1e65-4c6d-a7f0-ecd91ef8cbd5 req-2d693ed7-d335-4ae7-8bf2-972912842c70 service nova] Releasing lock "refresh_cache-0d87148b-1493-4777-a8b3-b94a64e8eca6" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 914.504949] env[60044]: DEBUG nova.compute.manager [req-96be5ec6-c363-4eda-a664-174aa6096414 req-6d675410-9bce-42c7-9a87-a5f6fc4a0c06 service nova] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Received event network-vif-deleted-cded2776-0714-4c2f-8cfb-b0045f17aa01 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 949.018844] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 951.015128] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 951.029683] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 951.029891] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 952.019078] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 952.019078] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 952.019459] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 953.015246] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 953.018518] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 953.028163] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 953.028427] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 953.028525] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.028675] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 953.029743] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-943ac1f3-a9dc-4d0a-bcce-7f1e419b67c0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.038705] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fcece86-f8d2-4ce7-a1f4-7c0c5934a0fd {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.052663] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-782fa6d8-7dd2-4c08-90dc-b2f20315767d {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.061020] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9db0a6b-5d88-4a71-90f3-004cf7d1213b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.090273] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181227MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 953.090475] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 953.090758] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 953.134824] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 426f9016-4e69-4e46-87f6-a67f77da5dff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 953.134991] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 27836d31-f379-4b4b-aed1-155f4a947779 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 953.135138] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ef011071-c0e1-44e0-9940-285f2f45da67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 953.147577] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance f3566a4b-8fe0-4c85-9c45-7c67cfd30323 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 953.158030] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance c0f7ff03-5203-418d-aa9e-420448e9dbfb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 953.158274] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 953.158446] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 953.232666] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8655836-0cf5-47fe-94f2-5ef7e9b6b3cd {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.240663] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd237000-a0e7-4563-8dc9-3908bf60f376 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.271099] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5521f4b-2636-41d0-8c97-06e5d845bdc9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.278462] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b09fb40-c82f-48d1-81e7-e2ded89cdbee {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.293222] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 953.303304] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 953.319456] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 953.319659] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.229s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 955.319712] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running 
periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 955.319991] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 955.320059] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 955.332409] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 955.332557] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 955.332686] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 955.332848] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 955.333262] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 958.435329] env[60044]: WARNING oslo_vmware.rw_handles [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 958.435329] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 958.435329] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 958.435329] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 958.435329] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 958.435329] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 958.435329] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 958.435329] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 958.435329] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 958.435329] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 958.435329] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without 
response [ 958.435329] env[60044]: ERROR oslo_vmware.rw_handles [ 958.435916] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/ee5ea26b-c0ec-4d8d-89fd-7a3039134f2d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 958.437783] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 958.438037] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Copying Virtual Disk [datastore2] vmware_temp/ee5ea26b-c0ec-4d8d-89fd-7a3039134f2d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/ee5ea26b-c0ec-4d8d-89fd-7a3039134f2d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 958.438333] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-57cbdb77-15e1-42ab-8fb8-d6ad41103d5c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 958.446825] env[60044]: DEBUG oslo_vmware.api [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Waiting for the task: (returnval){ [ 958.446825] env[60044]: value = "task-2204771" [ 958.446825] env[60044]: _type = "Task" [ 958.446825] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 958.455119] env[60044]: DEBUG oslo_vmware.api [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Task: {'id': task-2204771, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 958.956960] env[60044]: DEBUG oslo_vmware.exceptions [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Fault InvalidArgument not matched. 
{{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 958.958037] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 958.958037] env[60044]: ERROR nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 958.958037] env[60044]: Faults: ['InvalidArgument'] [ 958.958037] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Traceback (most recent call last): [ 958.958037] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 958.958037] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] yield resources [ 958.958037] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 958.958037] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] self.driver.spawn(context, instance, image_meta, [ 958.958037] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 958.958037] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] self._vmops.spawn(context, instance, image_meta, injected_files, [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] self._fetch_image_if_missing(context, vi) [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] image_cache(vi, tmp_image_ds_loc) [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] vm_util.copy_virtual_disk( [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] session._wait_for_task(vmdk_copy_task) [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] return self.wait_for_task(task_ref) [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] return evt.wait() [ 958.958366] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 958.958823] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] result = hub.switch() [ 958.958823] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 958.958823] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] return self.greenlet.switch() [ 958.958823] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 958.958823] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] self.f(*self.args, **self.kw) [ 958.958823] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 958.958823] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] raise exceptions.translate_fault(task_info.error) [ 958.958823] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 958.958823] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Faults: ['InvalidArgument'] [ 958.958823] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] [ 958.958823] env[60044]: INFO nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Terminating instance [ 958.959919] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 958.960032] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 958.960581] env[60044]: DEBUG nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Start destroying the instance on the 
hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 958.960765] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 958.960978] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f5e90f57-5246-4b33-a5df-db5bbe9346e3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 958.963150] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f7638fd-71cb-4709-b604-fc712016b7df {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 958.969693] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 958.969885] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-24eb2c56-96ae-4a2e-b5eb-fc1fa867a32a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 958.971817] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 958.971982] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 958.972896] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-099d1d1a-578b-4551-ae81-3c5037c82e34 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 958.977421] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for the task: (returnval){ [ 958.977421] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]526f616e-9875-2548-dc60-ca8a531e7d78" [ 958.977421] env[60044]: _type = "Task" [ 958.977421] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 958.984174] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]526f616e-9875-2548-dc60-ca8a531e7d78, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 959.040750] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 959.040968] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 959.041215] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Deleting the datastore file [datastore2] 426f9016-4e69-4e46-87f6-a67f77da5dff {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 959.041513] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e376327c-73f2-4512-b8b9-077111be7e9a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.047562] env[60044]: DEBUG oslo_vmware.api [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Waiting for the task: (returnval){ [ 959.047562] env[60044]: value = "task-2204773" [ 959.047562] env[60044]: _type = "Task" [ 959.047562] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 959.054779] env[60044]: DEBUG oslo_vmware.api [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Task: {'id': task-2204773, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 959.487713] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 959.488027] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Creating directory with path [datastore2] vmware_temp/02a6dc60-051a-4e29-a330-ba27e5062099/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 959.488207] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ac5546a8-c50d-492f-8a75-63da635249f5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.499259] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Created directory with path [datastore2] vmware_temp/02a6dc60-051a-4e29-a330-ba27e5062099/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 959.499454] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Fetch image to [datastore2] vmware_temp/02a6dc60-051a-4e29-a330-ba27e5062099/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 959.499619] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/02a6dc60-051a-4e29-a330-ba27e5062099/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 959.500370] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25f3ad21-8939-436d-adfd-16c7bb9c06c8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.506937] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4208a418-1a16-4e42-8ee1-19096ce84e83 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.516702] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adfbcd8c-0113-4225-a519-2bffbbae7b92 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.545906] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1d04ee3-fb94-4b7f-8521-6e29e1196a03 {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.555968] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fa1290c0-d108-4a2e-8998-732a3042d9ef {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.557566] env[60044]: DEBUG oslo_vmware.api [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Task: {'id': task-2204773, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071207} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 959.557791] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 959.557966] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 959.558148] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 959.558316] env[60044]: INFO nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Took 0.60 seconds to destroy the instance on the hypervisor. 
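The DeleteDatastoreFile_Task / task-2204773 exchange above follows the usual oslo.vmware pattern: invoke a vCenter task method through the session, then block on the task until it completes or faults. A minimal sketch of that pattern, assuming an already-created oslo.vmware VMwareAPISession; the helper name, datastore path and datacenter reference are placeholders, not Nova's actual ds_util code:

    # Minimal sketch, assuming an existing oslo.vmware VMwareAPISession
    # (for example: from oslo_vmware import api;
    #  session = api.VMwareAPISession(host, user, password,
    #                                 api_retry_count=10, task_poll_interval=0.5))
    # 'delete_datastore_file', 'ds_path' and 'dc_ref' are illustrative names.
    def delete_datastore_file(session, ds_path, dc_ref):
        vim = session.vim
        # Start the vCenter task (this is what appears above as
        # "Invoking FileManager.DeleteDatastoreFile_Task").
        task = session.invoke_api(
            vim, 'DeleteDatastoreFile_Task',
            vim.service_content.fileManager,
            name=ds_path,          # e.g. '[datastore2] <instance-uuid>'
            datacenter=dc_ref)     # Datacenter managed object reference
        # Poll until the task finishes; this produces the "progress is 0%" /
        # "completed successfully" lines, and raises a VimFaultException
        # (as seen earlier with CopyVirtualDisk_Task) if the task errors out.
        session.wait_for_task(task)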
[ 959.560303] env[60044]: DEBUG nova.compute.claims [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 959.560462] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 959.560666] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 959.588919] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 959.637872] env[60044]: DEBUG oslo_vmware.rw_handles [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/02a6dc60-051a-4e29-a330-ba27e5062099/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 959.695008] env[60044]: DEBUG oslo_vmware.rw_handles [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 959.695195] env[60044]: DEBUG oslo_vmware.rw_handles [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/02a6dc60-051a-4e29-a330-ba27e5062099/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 959.715804] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-368247f6-3f54-423e-8533-3bccd9b9f78b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.723095] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d2852e5-234c-4e05-b4f0-46cf864e8823 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.752678] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df1a362e-52a3-4b08-a435-a785572bec36 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.759288] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ec2104c-7597-4b2b-afed-5a6dc56dc2a5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.771737] env[60044]: DEBUG nova.compute.provider_tree [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 959.779788] env[60044]: DEBUG nova.scheduler.client.report [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 959.792747] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.232s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 959.793292] env[60044]: ERROR nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 959.793292] env[60044]: Faults: ['InvalidArgument'] [ 959.793292] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Traceback (most recent call last): [ 959.793292] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 959.793292] env[60044]: ERROR nova.compute.manager [instance: 
426f9016-4e69-4e46-87f6-a67f77da5dff] self.driver.spawn(context, instance, image_meta, [ 959.793292] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 959.793292] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] self._vmops.spawn(context, instance, image_meta, injected_files, [ 959.793292] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 959.793292] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] self._fetch_image_if_missing(context, vi) [ 959.793292] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 959.793292] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] image_cache(vi, tmp_image_ds_loc) [ 959.793292] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] vm_util.copy_virtual_disk( [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] session._wait_for_task(vmdk_copy_task) [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] return self.wait_for_task(task_ref) [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] return evt.wait() [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] result = hub.switch() [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] return self.greenlet.switch() [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 959.793585] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] self.f(*self.args, **self.kw) [ 959.793860] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 959.793860] env[60044]: ERROR 
nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] raise exceptions.translate_fault(task_info.error) [ 959.793860] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 959.793860] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Faults: ['InvalidArgument'] [ 959.793860] env[60044]: ERROR nova.compute.manager [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] [ 959.793968] env[60044]: DEBUG nova.compute.utils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] VimFaultException {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 959.795272] env[60044]: DEBUG nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Build of instance 426f9016-4e69-4e46-87f6-a67f77da5dff was re-scheduled: A specified parameter was not correct: fileType [ 959.795272] env[60044]: Faults: ['InvalidArgument'] {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 959.795636] env[60044]: DEBUG nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 959.795801] env[60044]: DEBUG nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 959.795950] env[60044]: DEBUG nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 959.796124] env[60044]: DEBUG nova.network.neutron [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 960.051737] env[60044]: DEBUG nova.network.neutron [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 960.062024] env[60044]: INFO nova.compute.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Took 0.27 seconds to deallocate network for instance. [ 960.143070] env[60044]: INFO nova.scheduler.client.report [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Deleted allocations for instance 426f9016-4e69-4e46-87f6-a67f77da5dff [ 960.160449] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "426f9016-4e69-4e46-87f6-a67f77da5dff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 431.206s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 960.161364] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "426f9016-4e69-4e46-87f6-a67f77da5dff" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 230.349s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 960.161574] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "426f9016-4e69-4e46-87f6-a67f77da5dff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 960.161771] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "426f9016-4e69-4e46-87f6-a67f77da5dff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 960.161931] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "426f9016-4e69-4e46-87f6-a67f77da5dff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 960.164023] env[60044]: INFO nova.compute.manager [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Terminating instance [ 960.165911] env[60044]: DEBUG nova.compute.manager [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 960.166118] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 960.166551] env[60044]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-75373ac9-137e-4402-b414-b735af9de6f9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.175508] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f816aff-484e-496e-9ce5-feec778a5e64 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.187547] env[60044]: DEBUG nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 960.208223] env[60044]: WARNING nova.virt.vmwareapi.vmops [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 426f9016-4e69-4e46-87f6-a67f77da5dff could not be found. [ 960.208423] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 960.208595] env[60044]: INFO nova.compute.manager [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Took 0.04 seconds to destroy the instance on the hypervisor. 
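The repeated "Acquiring lock ... / Lock ... acquired ... waited / Lock ... released ... held" triplets in this section are emitted by oslo.concurrency's lock wrappers around each critical section. A minimal usage sketch of that machinery; the function name, lock names and fair=True flag are illustrative assumptions, not the actual Nova wrappers:

    # Illustrative only -- not the actual Nova code. oslo.concurrency's
    # synchronized decorator serializes a critical section on a named
    # in-process lock and logs the wait/held times seen above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources', fair=True)
    def update_available_resource(context, nodename):
        # Runs with the "compute_resources" lock held; concurrent callers
        # queue up (fair=True keeps FIFO ordering) and the wrapper logs
        # "acquired ... waited Ns" / "released ... held Ns".
        ...

    # The per-path locks (e.g. the image-cache VMDK) use the same machinery
    # as a context manager, producing the "Acquiring lock" / "Acquired lock"
    # / "Releasing lock" lines:
    with lockutils.lock('[datastore2] devstack-image-cache_base/<image-id>.vmdk'):
        ...  # fetch the image once; other greenthreads wait here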
[ 960.208833] env[60044]: DEBUG oslo.service.loopingcall [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 960.209046] env[60044]: DEBUG nova.compute.manager [-] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 960.209142] env[60044]: DEBUG nova.network.neutron [-] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 960.236646] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 960.237178] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 960.238668] env[60044]: INFO nova.compute.claims [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 960.240890] env[60044]: DEBUG nova.network.neutron [-] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 960.248813] env[60044]: INFO nova.compute.manager [-] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Took 0.04 seconds to deallocate network for instance. 
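The oslo.service.loopingcall entry above ("Waiting for function ... _deallocate_network_with_retries to return.") comes from the RetryDecorator helper, which re-runs a function on selected exceptions with increasing sleeps. A rough sketch of that pattern; the retry counts, exception type, and wrapped body here are placeholders, not the manager's real implementation:

from oslo_service import loopingcall


@loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=2,
                            max_sleep_time=12, exceptions=(IOError,))
def _deallocate_network_with_retries():
    # Placeholder body: the real method asks Neutron to unbind and delete the
    # instance's ports.  RetryDecorator re-runs it after the listed transient
    # exceptions and logs "Waiting for function ... to return." while waiting.
    pass


_deallocate_network_with_retries()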
[ 960.348825] env[60044]: DEBUG oslo_concurrency.lockutils [None req-3eaeb65a-a7b7-4bb5-9adb-a5c55e0138d2 tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "426f9016-4e69-4e46-87f6-a67f77da5dff" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.187s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 960.359508] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b23c69dc-66f6-4625-8e3b-ee37f29327dd {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.367210] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bb6246b-c8fc-4ab6-8122-c00a3e7f2c4a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.396589] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d78dcdd-da9b-471d-947e-03855b0d7d42 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.403117] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9be577eb-0c85-4953-a02d-18cacbba4dd5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.416138] env[60044]: DEBUG nova.compute.provider_tree [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 960.424206] env[60044]: DEBUG nova.scheduler.client.report [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 960.436541] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 960.437010] env[60044]: DEBUG nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Start building networks asynchronously for instance. 
{{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 960.467380] env[60044]: DEBUG nova.compute.utils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 960.468715] env[60044]: DEBUG nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 960.468889] env[60044]: DEBUG nova.network.neutron [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 960.479111] env[60044]: DEBUG nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 960.526570] env[60044]: DEBUG nova.policy [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '618b89e8d8134c66b8662bdf4ca06d5c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '581c2db844984c00bc0bad0475272109', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 960.540054] env[60044]: DEBUG nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 960.559831] env[60044]: DEBUG nova.virt.hardware [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 960.560082] env[60044]: DEBUG nova.virt.hardware [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 960.560238] env[60044]: DEBUG nova.virt.hardware [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 960.560414] env[60044]: DEBUG nova.virt.hardware [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 960.560556] env[60044]: DEBUG nova.virt.hardware [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 960.560698] env[60044]: DEBUG nova.virt.hardware [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 960.561230] env[60044]: DEBUG nova.virt.hardware [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 960.561230] env[60044]: DEBUG nova.virt.hardware [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 960.561230] env[60044]: DEBUG nova.virt.hardware [None req-ef99a399-6c92-4b84-a789-c45fb6281616 
tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 960.561377] env[60044]: DEBUG nova.virt.hardware [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 960.561528] env[60044]: DEBUG nova.virt.hardware [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 960.562379] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8b5548d-ed59-4792-b885-ce1a14bb695c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.570088] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3a06535-f842-483e-9338-4e1f85ea01ea {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.819120] env[60044]: DEBUG nova.network.neutron [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Successfully created port: fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 961.341659] env[60044]: DEBUG nova.network.neutron [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Successfully updated port: fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 961.352152] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "refresh_cache-f3566a4b-8fe0-4c85-9c45-7c67cfd30323" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 961.352152] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquired lock "refresh_cache-f3566a4b-8fe0-4c85-9c45-7c67cfd30323" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 961.352152] env[60044]: DEBUG nova.network.neutron [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 961.382333] env[60044]: DEBUG nova.network.neutron [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Instance cache 
missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 961.528268] env[60044]: DEBUG nova.network.neutron [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Updating instance_info_cache with network_info: [{"id": "fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b", "address": "fa:16:3e:04:38:b6", "network": {"id": "77b126d2-d8b9-4b6a-8915-eb1f9e681f47", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1028282048-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "581c2db844984c00bc0bad0475272109", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfc13b8a8-96", "ovs_interfaceid": "fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 961.539669] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Releasing lock "refresh_cache-f3566a4b-8fe0-4c85-9c45-7c67cfd30323" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 961.540135] env[60044]: DEBUG nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Instance network_info: |[{"id": "fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b", "address": "fa:16:3e:04:38:b6", "network": {"id": "77b126d2-d8b9-4b6a-8915-eb1f9e681f47", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1028282048-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "581c2db844984c00bc0bad0475272109", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfc13b8a8-96", "ovs_interfaceid": "fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 961.540336] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:04:38:b6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '537e0890-4fa2-4f2d-b74c-49933a4edf53', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 961.547510] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Creating folder: Project (581c2db844984c00bc0bad0475272109). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 961.547971] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-822c5eff-cf25-4ebf-bbf0-42103b9c57b8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.557858] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Created folder: Project (581c2db844984c00bc0bad0475272109) in parent group-v449562. [ 961.558041] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Creating folder: Instances. Parent ref: group-v449619. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 961.558247] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dbaa1eb0-b2b1-4aec-9c68-5ee188a6c2f5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.566775] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Created folder: Instances in parent group-v449619. [ 961.566986] env[60044]: DEBUG oslo.service.loopingcall [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 961.567165] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 961.567335] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-649a2ff8-4762-42d7-ab18-5319e334afd4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.586125] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 961.586125] env[60044]: value = "task-2204776" [ 961.586125] env[60044]: _type = "Task" [ 961.586125] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 961.593260] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204776, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 962.089455] env[60044]: DEBUG nova.compute.manager [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Received event network-vif-plugged-fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 962.089593] env[60044]: DEBUG oslo_concurrency.lockutils [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] Acquiring lock "f3566a4b-8fe0-4c85-9c45-7c67cfd30323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 962.089778] env[60044]: DEBUG oslo_concurrency.lockutils [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] Lock "f3566a4b-8fe0-4c85-9c45-7c67cfd30323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 962.089936] env[60044]: DEBUG oslo_concurrency.lockutils [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] Lock "f3566a4b-8fe0-4c85-9c45-7c67cfd30323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 962.090104] env[60044]: DEBUG nova.compute.manager [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] No waiting events found dispatching network-vif-plugged-fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 962.090261] env[60044]: WARNING nova.compute.manager [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Received unexpected event network-vif-plugged-fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b for instance with vm_state building and task_state spawning. [ 962.090412] env[60044]: DEBUG nova.compute.manager [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Received event network-changed-fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 962.090559] env[60044]: DEBUG nova.compute.manager [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Refreshing instance network info cache due to event network-changed-fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 962.090732] env[60044]: DEBUG oslo_concurrency.lockutils [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] Acquiring lock "refresh_cache-f3566a4b-8fe0-4c85-9c45-7c67cfd30323" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 962.090862] env[60044]: DEBUG oslo_concurrency.lockutils [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] Acquired lock "refresh_cache-f3566a4b-8fe0-4c85-9c45-7c67cfd30323" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 962.091017] env[60044]: DEBUG nova.network.neutron [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Refreshing network info cache for port fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 962.101134] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204776, 'name': CreateVM_Task, 'duration_secs': 0.291458} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 962.101698] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 962.103990] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 962.104158] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 962.104449] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 962.105018] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3679dd8d-441f-4486-b165-829cd2c41c70 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.109246] env[60044]: DEBUG oslo_vmware.api [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Waiting for the task: (returnval){ [ 962.109246] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52d0d443-acf7-3fbb-b0ae-c74cea1bd4f3" [ 962.109246] env[60044]: _type = "Task" [ 962.109246] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 962.116602] env[60044]: DEBUG oslo_vmware.api [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52d0d443-acf7-3fbb-b0ae-c74cea1bd4f3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 962.320511] env[60044]: DEBUG nova.network.neutron [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Updated VIF entry in instance network info cache for port fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b. {{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 962.320852] env[60044]: DEBUG nova.network.neutron [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Updating instance_info_cache with network_info: [{"id": "fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b", "address": "fa:16:3e:04:38:b6", "network": {"id": "77b126d2-d8b9-4b6a-8915-eb1f9e681f47", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1028282048-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "581c2db844984c00bc0bad0475272109", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfc13b8a8-96", "ovs_interfaceid": "fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 962.329504] env[60044]: DEBUG oslo_concurrency.lockutils [req-10200f59-c852-4a1e-8526-0a1294ab4fce req-8a27f40c-34cc-4671-9827-b20c2dad0836 service nova] Releasing lock "refresh_cache-f3566a4b-8fe0-4c85-9c45-7c67cfd30323" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 962.620570] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 962.621036] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 962.621099] env[60044]: DEBUG 
oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1006.187856] env[60044]: WARNING oslo_vmware.rw_handles [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1006.187856] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1006.187856] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1006.187856] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1006.187856] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1006.187856] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 1006.187856] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1006.187856] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1006.187856] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1006.187856] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1006.187856] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1006.187856] env[60044]: ERROR oslo_vmware.rw_handles [ 1006.188496] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/02a6dc60-051a-4e29-a330-ba27e5062099/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1006.190573] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1006.190902] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Copying Virtual Disk [datastore2] vmware_temp/02a6dc60-051a-4e29-a330-ba27e5062099/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/02a6dc60-051a-4e29-a330-ba27e5062099/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1006.191286] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f50c6dc1-5f44-45e7-a6d1-69e7a5579308 {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1006.199596] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for the task: (returnval){ [ 1006.199596] env[60044]: value = "task-2204777" [ 1006.199596] env[60044]: _type = "Task" [ 1006.199596] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1006.209147] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Task: {'id': task-2204777, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1006.709912] env[60044]: DEBUG oslo_vmware.exceptions [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Fault InvalidArgument not matched. {{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1006.710203] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1006.710757] env[60044]: ERROR nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1006.710757] env[60044]: Faults: ['InvalidArgument'] [ 1006.710757] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Traceback (most recent call last): [ 1006.710757] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1006.710757] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] yield resources [ 1006.710757] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1006.710757] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] self.driver.spawn(context, instance, image_meta, [ 1006.710757] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1006.710757] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1006.710757] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1006.710757] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] self._fetch_image_if_missing(context, vi) [ 1006.710757] env[60044]: ERROR 
nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] image_cache(vi, tmp_image_ds_loc) [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] vm_util.copy_virtual_disk( [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] session._wait_for_task(vmdk_copy_task) [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] return self.wait_for_task(task_ref) [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] return evt.wait() [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] result = hub.switch() [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1006.711153] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] return self.greenlet.switch() [ 1006.711459] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1006.711459] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] self.f(*self.args, **self.kw) [ 1006.711459] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1006.711459] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] raise exceptions.translate_fault(task_info.error) [ 1006.711459] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1006.711459] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Faults: ['InvalidArgument'] [ 1006.711459] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] [ 1006.711459] env[60044]: INFO nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 
27836d31-f379-4b4b-aed1-155f4a947779] Terminating instance [ 1006.712601] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1006.712806] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1006.713038] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0d2cf112-e28b-443f-93e8-710b8ac12ecf {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1006.715341] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1006.715488] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1006.716202] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b86aeef3-2ac7-4bd4-941c-1d673ab240f7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1006.722341] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1006.722547] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e6a1fc8a-10db-4db4-9d05-dc07a26e7f4c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1006.724684] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1006.724851] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1006.725838] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d2884159-bcb1-46ab-85a8-63592b3755e8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1006.730435] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for the task: (returnval){ [ 1006.730435] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]520b7f19-cdea-fd58-0a42-3544e094d17e" [ 1006.730435] env[60044]: _type = "Task" [ 1006.730435] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1006.737162] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]520b7f19-cdea-fd58-0a42-3544e094d17e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1006.799218] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1006.799434] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1006.799611] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Deleting the datastore file [datastore2] 27836d31-f379-4b4b-aed1-155f4a947779 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1006.799871] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8d6fe9eb-bf32-4b15-9c52-6a31cb80d583 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1006.805849] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for the task: (returnval){ [ 1006.805849] env[60044]: value = "task-2204779" [ 1006.805849] env[60044]: _type = "Task" [ 1006.805849] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1006.813250] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Task: {'id': task-2204779, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1007.240769] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1007.241107] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Creating directory with path [datastore2] vmware_temp/4c9439a5-bde1-488f-8ff2-9d1b01bc68d5/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1007.241240] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6c7bba88-796e-494e-8689-187ad08417ac {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.252269] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Created directory with path [datastore2] vmware_temp/4c9439a5-bde1-488f-8ff2-9d1b01bc68d5/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1007.252428] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Fetch image to [datastore2] vmware_temp/4c9439a5-bde1-488f-8ff2-9d1b01bc68d5/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1007.252561] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/4c9439a5-bde1-488f-8ff2-9d1b01bc68d5/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1007.253256] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6a87d69-d66d-4605-acaf-38b0242991b4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.259519] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e822c32-ee86-4e8c-b02f-3a49a779f232 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.268176] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e099250-b92b-48bb-b2df-e2695dcc5413 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.297796] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f6e0dd9-7f51-4068-8c1c-d9779a3d9f37 {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.302733] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-463f4eea-c7db-486b-9c90-dc328bfa8a68 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.314584] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Task: {'id': task-2204779, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064742} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1007.314796] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1007.314985] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1007.315189] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1007.315387] env[60044]: INFO nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Took 0.60 seconds to destroy the instance on the hypervisor. 
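The CopyVirtualDisk_Task / DeleteDatastoreFile_Task entries above follow oslo.vmware's invoke-then-poll pattern: a vCenter task is started through the API session and wait_for_task() polls it until completion, translating vCenter faults such as the InvalidArgument ('fileType') one into VimFaultException. A minimal sketch under assumed connection details; the host, credentials, datastore path, and datacenter reference are placeholders and this is not Nova's actual code path:

from oslo_vmware import api
from oslo_vmware import exceptions as vexc

session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                               api_retry_count=10, task_poll_interval=0.5)

dc_ref = None  # placeholder: a real Datacenter managed-object reference
task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                          session.vim.service_content.fileManager,
                          name='[datastore2] vmware_temp/example-dir',
                          datacenter=dc_ref)
try:
    # Blocks while _poll_task logs progress updates; a vCenter fault is
    # re-raised as VimFaultException, as in the traceback shown earlier.
    session.wait_for_task(task)
except vexc.VimFaultException as err:
    print(err.fault_list)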
[ 1007.317419] env[60044]: DEBUG nova.compute.claims [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1007.317573] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1007.317799] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1007.323677] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1007.369510] env[60044]: DEBUG oslo_vmware.rw_handles [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4c9439a5-bde1-488f-8ff2-9d1b01bc68d5/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1007.426724] env[60044]: DEBUG oslo_vmware.rw_handles [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1007.426906] env[60044]: DEBUG oslo_vmware.rw_handles [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4c9439a5-bde1-488f-8ff2-9d1b01bc68d5/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1007.469397] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0df60664-1050-48df-a307-efd320f7581a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.476440] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ec24476-0f18-4ad1-9921-a7075c853cee {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.505037] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cef31161-67b4-4d97-b067-2ceac02fde02 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.511356] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2743b9d6-62db-494b-a611-d6b3c11f4bc1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.523537] env[60044]: DEBUG nova.compute.provider_tree [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1007.531966] env[60044]: DEBUG nova.scheduler.client.report [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1007.545024] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.227s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1007.545544] env[60044]: ERROR nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1007.545544] env[60044]: Faults: ['InvalidArgument'] [ 1007.545544] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Traceback (most recent call last): [ 1007.545544] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1007.545544] env[60044]: ERROR nova.compute.manager [instance: 
27836d31-f379-4b4b-aed1-155f4a947779] self.driver.spawn(context, instance, image_meta, [ 1007.545544] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1007.545544] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1007.545544] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1007.545544] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] self._fetch_image_if_missing(context, vi) [ 1007.545544] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1007.545544] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] image_cache(vi, tmp_image_ds_loc) [ 1007.545544] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] vm_util.copy_virtual_disk( [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] session._wait_for_task(vmdk_copy_task) [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] return self.wait_for_task(task_ref) [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] return evt.wait() [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] result = hub.switch() [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] return self.greenlet.switch() [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1007.545840] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] self.f(*self.args, **self.kw) [ 1007.546425] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 
1007.546425] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] raise exceptions.translate_fault(task_info.error) [ 1007.546425] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1007.546425] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Faults: ['InvalidArgument'] [ 1007.546425] env[60044]: ERROR nova.compute.manager [instance: 27836d31-f379-4b4b-aed1-155f4a947779] [ 1007.546425] env[60044]: DEBUG nova.compute.utils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] VimFaultException {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1007.547511] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Build of instance 27836d31-f379-4b4b-aed1-155f4a947779 was re-scheduled: A specified parameter was not correct: fileType [ 1007.547511] env[60044]: Faults: ['InvalidArgument'] {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1007.547875] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1007.548055] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1007.548232] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1007.548385] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1007.805333] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1007.816444] env[60044]: INFO nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Took 0.27 seconds to deallocate network for instance. [ 1007.901212] env[60044]: INFO nova.scheduler.client.report [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Deleted allocations for instance 27836d31-f379-4b4b-aed1-155f4a947779 [ 1007.916638] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "27836d31-f379-4b4b-aed1-155f4a947779" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 474.151s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1007.917670] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "27836d31-f379-4b4b-aed1-155f4a947779" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 274.971s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1007.917969] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "27836d31-f379-4b4b-aed1-155f4a947779-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1007.918308] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "27836d31-f379-4b4b-aed1-155f4a947779-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60044) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1007.918569] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "27836d31-f379-4b4b-aed1-155f4a947779-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1007.921758] env[60044]: INFO nova.compute.manager [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Terminating instance [ 1007.923889] env[60044]: DEBUG nova.compute.manager [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1007.924089] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1007.924338] env[60044]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-57150a7d-a6a3-4d93-ae57-3a2ce5a1666f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.934856] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9847179c-b634-4827-94fb-0f3ec59b8fdc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.945676] env[60044]: DEBUG nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1007.964366] env[60044]: WARNING nova.virt.vmwareapi.vmops [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 27836d31-f379-4b4b-aed1-155f4a947779 could not be found. [ 1007.964569] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1007.964739] env[60044]: INFO nova.compute.manager [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1007.965062] env[60044]: DEBUG oslo.service.loopingcall [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1007.965239] env[60044]: DEBUG nova.compute.manager [-] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1007.965340] env[60044]: DEBUG nova.network.neutron [-] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1007.992317] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1007.992683] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1007.994359] env[60044]: INFO nova.compute.claims [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1007.998407] env[60044]: DEBUG nova.network.neutron [-] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1008.009086] env[60044]: INFO nova.compute.manager [-] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Took 0.04 seconds to deallocate network for instance. 
[ 1008.093741] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fea909e9-e5b3-4e43-9bb5-bd27a9ecf28d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.099678] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eac98843-925e-4a46-9b9e-3f83cab2f233 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "27836d31-f379-4b4b-aed1-155f4a947779" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1008.103423] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14a7fd15-821d-4546-9559-2eea50e4375a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.132826] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-163577fb-6b7a-4700-9882-6ef2a71f331f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.139881] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-013808e1-4da6-4a67-a81c-c129e5febbe4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.154205] env[60044]: DEBUG nova.compute.provider_tree [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1008.162143] env[60044]: DEBUG nova.scheduler.client.report [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1008.174040] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.181s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1008.174495] env[60044]: DEBUG nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Start building networks asynchronously for instance. 
{{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1008.207427] env[60044]: DEBUG nova.compute.utils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1008.208729] env[60044]: DEBUG nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1008.208895] env[60044]: DEBUG nova.network.neutron [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1008.216081] env[60044]: DEBUG nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1008.260747] env[60044]: DEBUG nova.policy [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b786da2369eb45ab916b9e137d644dc8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eb70c075cb2e4c44917d5ba6cb849786', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 1008.293871] env[60044]: DEBUG nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1008.313914] env[60044]: DEBUG nova.virt.hardware [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1008.314181] env[60044]: DEBUG nova.virt.hardware [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1008.314352] env[60044]: DEBUG nova.virt.hardware [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1008.314535] env[60044]: DEBUG nova.virt.hardware [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1008.314690] env[60044]: DEBUG nova.virt.hardware [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1008.314836] env[60044]: DEBUG nova.virt.hardware [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1008.315076] env[60044]: DEBUG nova.virt.hardware [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1008.315245] env[60044]: DEBUG nova.virt.hardware [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1008.315404] 
env[60044]: DEBUG nova.virt.hardware [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1008.315559] env[60044]: DEBUG nova.virt.hardware [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1008.315723] env[60044]: DEBUG nova.virt.hardware [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1008.316835] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a3b4414-52d7-400d-86b6-43d3dac7f1d3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.325168] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31d4bc2b-cd71-4d06-b4cc-40fdcd0c9418 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.543099] env[60044]: DEBUG nova.network.neutron [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Successfully created port: 4cbfe223-af4a-4e63-a600-d6e0ee204ee8 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1009.019163] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1009.053426] env[60044]: DEBUG nova.network.neutron [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Successfully updated port: 4cbfe223-af4a-4e63-a600-d6e0ee204ee8 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1009.065072] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "refresh_cache-c0f7ff03-5203-418d-aa9e-420448e9dbfb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1009.065286] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquired lock "refresh_cache-c0f7ff03-5203-418d-aa9e-420448e9dbfb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1009.065448] env[60044]: DEBUG nova.network.neutron [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 
tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1009.101269] env[60044]: DEBUG nova.network.neutron [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Instance cache missing network info. {{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1009.250770] env[60044]: DEBUG nova.network.neutron [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Updating instance_info_cache with network_info: [{"id": "4cbfe223-af4a-4e63-a600-d6e0ee204ee8", "address": "fa:16:3e:e9:0b:b4", "network": {"id": "d8303e32-b5c8-45fb-a675-dcf0505feff5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-774580778-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb70c075cb2e4c44917d5ba6cb849786", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bd998416-f3d6-4a62-b828-5011063ce76a", "external-id": "nsx-vlan-transportzone-57", "segmentation_id": 57, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4cbfe223-af", "ovs_interfaceid": "4cbfe223-af4a-4e63-a600-d6e0ee204ee8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1009.262205] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Releasing lock "refresh_cache-c0f7ff03-5203-418d-aa9e-420448e9dbfb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1009.262483] env[60044]: DEBUG nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Instance network_info: |[{"id": "4cbfe223-af4a-4e63-a600-d6e0ee204ee8", "address": "fa:16:3e:e9:0b:b4", "network": {"id": "d8303e32-b5c8-45fb-a675-dcf0505feff5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-774580778-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb70c075cb2e4c44917d5ba6cb849786", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", 
"details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bd998416-f3d6-4a62-b828-5011063ce76a", "external-id": "nsx-vlan-transportzone-57", "segmentation_id": 57, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4cbfe223-af", "ovs_interfaceid": "4cbfe223-af4a-4e63-a600-d6e0ee204ee8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1009.262883] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e9:0b:b4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'bd998416-f3d6-4a62-b828-5011063ce76a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4cbfe223-af4a-4e63-a600-d6e0ee204ee8', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1009.270746] env[60044]: DEBUG oslo.service.loopingcall [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1009.271227] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1009.271482] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5fa2a1c7-712a-4542-99a4-f18507ce8918 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1009.291695] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1009.291695] env[60044]: value = "task-2204780" [ 1009.291695] env[60044]: _type = "Task" [ 1009.291695] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1009.299628] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204780, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1009.801481] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204780, 'name': CreateVM_Task, 'duration_secs': 0.30153} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1009.801683] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1009.802284] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1009.802439] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1009.802750] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1009.802983] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-74670d2b-c7b9-4fc1-a340-386c4cda376e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1009.807674] env[60044]: DEBUG oslo_vmware.api [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Waiting for the task: (returnval){ [ 1009.807674] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52c5cfa5-50c9-b830-5f16-99b1e6814b99" [ 1009.807674] env[60044]: _type = "Task" [ 1009.807674] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1009.816270] env[60044]: DEBUG oslo_vmware.api [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52c5cfa5-50c9-b830-5f16-99b1e6814b99, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1009.842804] env[60044]: DEBUG nova.compute.manager [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Received event network-vif-plugged-4cbfe223-af4a-4e63-a600-d6e0ee204ee8 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1009.842974] env[60044]: DEBUG oslo_concurrency.lockutils [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] Acquiring lock "c0f7ff03-5203-418d-aa9e-420448e9dbfb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1009.843194] env[60044]: DEBUG oslo_concurrency.lockutils [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] Lock "c0f7ff03-5203-418d-aa9e-420448e9dbfb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1009.843351] env[60044]: DEBUG oslo_concurrency.lockutils [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] Lock "c0f7ff03-5203-418d-aa9e-420448e9dbfb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1009.843504] env[60044]: DEBUG nova.compute.manager [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] No waiting events found dispatching network-vif-plugged-4cbfe223-af4a-4e63-a600-d6e0ee204ee8 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1009.843661] env[60044]: WARNING nova.compute.manager [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Received unexpected event network-vif-plugged-4cbfe223-af4a-4e63-a600-d6e0ee204ee8 for instance with vm_state building and task_state spawning. [ 1009.843807] env[60044]: DEBUG nova.compute.manager [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Received event network-changed-4cbfe223-af4a-4e63-a600-d6e0ee204ee8 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1009.843951] env[60044]: DEBUG nova.compute.manager [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Refreshing instance network info cache due to event network-changed-4cbfe223-af4a-4e63-a600-d6e0ee204ee8. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1009.844145] env[60044]: DEBUG oslo_concurrency.lockutils [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] Acquiring lock "refresh_cache-c0f7ff03-5203-418d-aa9e-420448e9dbfb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1009.844278] env[60044]: DEBUG oslo_concurrency.lockutils [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] Acquired lock "refresh_cache-c0f7ff03-5203-418d-aa9e-420448e9dbfb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1009.844422] env[60044]: DEBUG nova.network.neutron [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Refreshing network info cache for port 4cbfe223-af4a-4e63-a600-d6e0ee204ee8 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1010.225657] env[60044]: DEBUG nova.network.neutron [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Updated VIF entry in instance network info cache for port 4cbfe223-af4a-4e63-a600-d6e0ee204ee8. {{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1010.226021] env[60044]: DEBUG nova.network.neutron [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Updating instance_info_cache with network_info: [{"id": "4cbfe223-af4a-4e63-a600-d6e0ee204ee8", "address": "fa:16:3e:e9:0b:b4", "network": {"id": "d8303e32-b5c8-45fb-a675-dcf0505feff5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-774580778-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "eb70c075cb2e4c44917d5ba6cb849786", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bd998416-f3d6-4a62-b828-5011063ce76a", "external-id": "nsx-vlan-transportzone-57", "segmentation_id": 57, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4cbfe223-af", "ovs_interfaceid": "4cbfe223-af4a-4e63-a600-d6e0ee204ee8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1010.234719] env[60044]: DEBUG oslo_concurrency.lockutils [req-4fb39d2f-4ec0-4d1d-8fe1-82b5b5e70013 req-51fde121-3f8b-47b0-932b-155a3e4590cc service nova] Releasing lock "refresh_cache-c0f7ff03-5203-418d-aa9e-420448e9dbfb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1010.318082] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Releasing lock "[datastore2] 
devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1010.318378] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1010.318577] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1012.019066] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1013.014294] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1013.017910] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1013.018170] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1013.018415] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1013.018627] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1013.018846] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1013.028469] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1013.028804] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1013.029015] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1013.029244] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1013.030310] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bba87094-b59a-4e89-9dc0-fff90085182d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.038969] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f12bcc43-f470-47bd-9db9-6c156723113f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.052036] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b2ea201-3185-4366-84c4-6f7c83570a34 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.058045] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9760c633-649e-4fb9-b6f1-7f37c5c190c0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.086966] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181237MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1013.087133] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1013.087314] env[60044]: DEBUG oslo_concurrency.lockutils [None 
req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1013.130547] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance ef011071-c0e1-44e0-9940-285f2f45da67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1013.130733] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance f3566a4b-8fe0-4c85-9c45-7c67cfd30323 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1013.130879] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance c0f7ff03-5203-418d-aa9e-420448e9dbfb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1013.131066] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1013.131207] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1013.183019] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d721a581-0b39-44b2-a83f-478c36900ba3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.187918] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b3d7a3f-aa0b-41f6-ba9a-e94e9f8e579a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.216881] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f95caa1c-cb15-4c8a-a7d9-2216d8862fb6 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.223649] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0927af8-1b3f-4665-8e59-d7a7313667a0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.236811] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1013.244429] env[60044]: DEBUG 
nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1013.256631] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1013.256922] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1017.259562] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1017.259922] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1017.259922] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1017.272085] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1017.272242] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1017.272374] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1017.272498] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. 
{{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1017.272950] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1031.490280] env[60044]: DEBUG nova.compute.manager [req-5b4cf3cd-66d9-4ab9-ba77-6a8fffed910d req-f0c69185-a19d-4634-91e4-547a0579e30d service nova] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Received event network-vif-deleted-fc13b8a8-96f7-4ddd-9766-6bcd2fc2df1b {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1053.992033] env[60044]: WARNING oslo_vmware.rw_handles [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1053.992033] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1053.992033] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1053.992033] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1053.992033] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1053.992033] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 1053.992033] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1053.992033] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1053.992033] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1053.992033] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1053.992033] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1053.992033] env[60044]: ERROR oslo_vmware.rw_handles [ 1053.992802] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/4c9439a5-bde1-488f-8ff2-9d1b01bc68d5/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1053.994519] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1053.994804] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Copying Virtual Disk [datastore2] vmware_temp/4c9439a5-bde1-488f-8ff2-9d1b01bc68d5/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] 
vmware_temp/4c9439a5-bde1-488f-8ff2-9d1b01bc68d5/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1053.995105] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ca89779f-b983-4d6d-a0fe-0ce44f8b765b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.002906] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for the task: (returnval){ [ 1054.002906] env[60044]: value = "task-2204781" [ 1054.002906] env[60044]: _type = "Task" [ 1054.002906] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1054.010577] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Task: {'id': task-2204781, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1054.513287] env[60044]: DEBUG oslo_vmware.exceptions [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Fault InvalidArgument not matched. {{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1054.514086] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1054.514406] env[60044]: ERROR nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1054.514406] env[60044]: Faults: ['InvalidArgument'] [ 1054.514406] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Traceback (most recent call last): [ 1054.514406] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1054.514406] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] yield resources [ 1054.514406] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1054.514406] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] self.driver.spawn(context, instance, image_meta, [ 1054.514406] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1054.514406] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] 
self._vmops.spawn(context, instance, image_meta, injected_files, [ 1054.514406] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1054.514406] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] self._fetch_image_if_missing(context, vi) [ 1054.514406] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] image_cache(vi, tmp_image_ds_loc) [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] vm_util.copy_virtual_disk( [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] session._wait_for_task(vmdk_copy_task) [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] return self.wait_for_task(task_ref) [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] return evt.wait() [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] result = hub.switch() [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1054.514934] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] return self.greenlet.switch() [ 1054.515308] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1054.515308] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] self.f(*self.args, **self.kw) [ 1054.515308] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1054.515308] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] raise exceptions.translate_fault(task_info.error) [ 1054.515308] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1054.515308] env[60044]: 
ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Faults: ['InvalidArgument'] [ 1054.515308] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] [ 1054.515308] env[60044]: INFO nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Terminating instance [ 1054.516286] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1054.516487] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1054.516715] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-021b57e8-623f-4c4e-9853-20e96e96ed05 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.519037] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1054.519232] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1054.519924] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d74e9ffb-cb78-4e98-966d-ecafa41a26e7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.526605] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1054.526807] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-95b66e78-1b68-42a5-8ada-c8aefba36f35 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.528907] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1054.529086] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1054.530017] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e952e104-4380-4e38-98b8-24a6b51c8c8f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.534913] env[60044]: DEBUG oslo_vmware.api [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Waiting for the task: (returnval){ [ 1054.534913] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]5288baf6-a1d3-1a38-209e-fcbc32b985e2" [ 1054.534913] env[60044]: _type = "Task" [ 1054.534913] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1054.544999] env[60044]: DEBUG oslo_vmware.api [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]5288baf6-a1d3-1a38-209e-fcbc32b985e2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1054.599811] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1054.600043] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1054.600226] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Deleting the datastore file [datastore2] ef011071-c0e1-44e0-9940-285f2f45da67 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1054.600482] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fd10e60a-dd77-4e0f-b8c0-6a5d7b136183 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.607741] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for the task: (returnval){ [ 1054.607741] env[60044]: value = "task-2204783" [ 1054.607741] env[60044]: _type = "Task" [ 1054.607741] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1054.615311] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Task: {'id': task-2204783, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1055.046147] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1055.046513] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Creating directory with path [datastore2] vmware_temp/53ed6eb5-7725-461d-a72e-6ffa65bec40d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1055.046616] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fb3a3a49-ecc7-45c7-997c-c7f9e10e2d81 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.057765] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Created directory with path [datastore2] vmware_temp/53ed6eb5-7725-461d-a72e-6ffa65bec40d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1055.057952] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Fetch image to [datastore2] vmware_temp/53ed6eb5-7725-461d-a72e-6ffa65bec40d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1055.058133] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/53ed6eb5-7725-461d-a72e-6ffa65bec40d/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1055.058862] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92e940a2-43c8-4b2b-9572-1b547032269c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.065168] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c0aa0a4-1673-4240-9e8f-4d493de6759d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.073842] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2166f812-7009-4b9c-8843-754efa087272 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.104936] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-affca100-93d4-47d0-a334-ca4a1738460f {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.113158] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f7d38553-f225-4942-b5e7-7e29c59520e4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.117345] env[60044]: DEBUG oslo_vmware.api [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Task: {'id': task-2204783, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06625} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1055.117911] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1055.118124] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1055.118296] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1055.118466] env[60044]: INFO nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1055.120625] env[60044]: DEBUG nova.compute.claims [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1055.120787] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1055.121014] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1055.135934] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1055.197211] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43157625-744e-45c2-9a8d-4a2530a40ca5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.204546] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-527141fb-b177-48b2-b35c-ce9b0e3ce046 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.235007] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a05038fa-c7ea-457f-926a-043aa4da4f39 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.242117] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de14b8b1-97ea-41a8-b156-a8f235813e87 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.255060] env[60044]: DEBUG nova.compute.provider_tree [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1055.263386] env[60044]: DEBUG nova.scheduler.client.report [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1055.275826] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.155s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1055.276304] env[60044]: ERROR nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1055.276304] env[60044]: Faults: ['InvalidArgument'] [ 1055.276304] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Traceback (most recent call last): [ 1055.276304] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1055.276304] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] self.driver.spawn(context, instance, image_meta, [ 1055.276304] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1055.276304] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1055.276304] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1055.276304] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] self._fetch_image_if_missing(context, vi) [ 1055.276304] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1055.276304] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] image_cache(vi, tmp_image_ds_loc) [ 1055.276304] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] vm_util.copy_virtual_disk( [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] session._wait_for_task(vmdk_copy_task) [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] return self.wait_for_task(task_ref) [ 1055.276700] env[60044]: ERROR 
nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] return evt.wait() [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] result = hub.switch() [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] return self.greenlet.switch() [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1055.276700] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] self.f(*self.args, **self.kw) [ 1055.277088] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1055.277088] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] raise exceptions.translate_fault(task_info.error) [ 1055.277088] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1055.277088] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Faults: ['InvalidArgument'] [ 1055.277088] env[60044]: ERROR nova.compute.manager [instance: ef011071-c0e1-44e0-9940-285f2f45da67] [ 1055.277088] env[60044]: DEBUG nova.compute.utils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] VimFaultException {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1055.278337] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Build of instance ef011071-c0e1-44e0-9940-285f2f45da67 was re-scheduled: A specified parameter was not correct: fileType [ 1055.278337] env[60044]: Faults: ['InvalidArgument'] {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1055.278694] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1055.278860] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs 
should be unplugged. {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1055.279034] env[60044]: DEBUG nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1055.279232] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1055.355834] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1055.357544] env[60044]: ERROR nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1055.357544] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Traceback (most recent call last): [ 1055.357544] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1055.357544] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1055.357544] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1055.357544] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] result = getattr(controller, method)(*args, **kwargs) [ 1055.357544] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1055.357544] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self._get(image_id) [ 1055.357544] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1055.357544] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1055.357544] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] resp, body = self.http_client.get(url, headers=header) [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 
6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self.request(url, 'GET', **kwargs) [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self._handle_response(resp) [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise exc.from_response(resp, resp.content) [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] During handling of the above exception, another exception occurred: [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1055.357895] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Traceback (most recent call last): [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] yield resources [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self.driver.spawn(context, instance, image_meta, [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self._fetch_image_if_missing(context, vi) [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] image_fetch(context, vi, tmp_image_ds_loc) [ 1055.358277] env[60044]: ERROR 
nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] images.fetch_image( [ 1055.358277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] metadata = IMAGE_API.get(context, image_ref) [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return session.show(context, image_id, [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] _reraise_translated_image_exception(image_id) [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise new_exc.with_traceback(exc_trace) [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] result = getattr(controller, method)(*args, **kwargs) [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1055.358668] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self._get(image_id) [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] resp, body = self.http_client.get(url, headers=header) [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] 
return self.request(url, 'GET', **kwargs) [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self._handle_response(resp) [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise exc.from_response(resp, resp.content) [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1055.359071] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1055.359616] env[60044]: INFO nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Terminating instance [ 1055.359616] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1055.359718] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1055.360280] env[60044]: DEBUG nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1055.360477] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1055.360731] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-48128edd-fe63-49b0-8745-8159357b6197 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.363302] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29150ff8-0739-4a46-b57f-d87c8d76474c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.371787] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1055.371985] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-33f454bf-08cd-4471-90a5-64d520f93bc8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.374166] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1055.374331] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1055.375283] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d4c9c20d-2033-4251-8089-fb71faa0cf14 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.379832] env[60044]: DEBUG oslo_vmware.api [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Waiting for the task: (returnval){ [ 1055.379832] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]5251c7f1-d9ad-619f-f3ee-1b7ba50882cf" [ 1055.379832] env[60044]: _type = "Task" [ 1055.379832] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1055.386763] env[60044]: DEBUG oslo_vmware.api [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]5251c7f1-d9ad-619f-f3ee-1b7ba50882cf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1055.445175] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1055.445395] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1055.445593] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Deleting the datastore file [datastore2] 6874067b-8e9b-4242-9a5f-6312f1484a00 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1055.445852] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1cb3025b-b82a-42f6-9907-a0b26b2ae90d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.451491] env[60044]: DEBUG oslo_vmware.api [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Waiting for the task: (returnval){ [ 1055.451491] env[60044]: value = "task-2204785" [ 1055.451491] env[60044]: _type = "Task" [ 1055.451491] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1055.458793] env[60044]: DEBUG oslo_vmware.api [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Task: {'id': task-2204785, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1055.616249] env[60044]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1055.633123] env[60044]: INFO nova.compute.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Took 0.35 seconds to deallocate network for instance. 
[ 1055.719778] env[60044]: INFO nova.scheduler.client.report [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Deleted allocations for instance ef011071-c0e1-44e0-9940-285f2f45da67 [ 1055.737053] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "ef011071-c0e1-44e0-9940-285f2f45da67" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 521.918s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1055.737181] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "ef011071-c0e1-44e0-9940-285f2f45da67" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 322.694s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1055.737467] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "ef011071-c0e1-44e0-9940-285f2f45da67-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1055.737714] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "ef011071-c0e1-44e0-9940-285f2f45da67-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1055.737925] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "ef011071-c0e1-44e0-9940-285f2f45da67-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1055.740052] env[60044]: INFO nova.compute.manager [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Terminating instance [ 1055.741674] env[60044]: DEBUG nova.compute.manager [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1055.741889] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1055.742387] env[60044]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f4b9220d-621d-42b0-8a43-555504c5392e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.751928] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1994f44d-0922-4b62-8ed3-00848708b229 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.780263] env[60044]: WARNING nova.virt.vmwareapi.vmops [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ef011071-c0e1-44e0-9940-285f2f45da67 could not be found. [ 1055.780474] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1055.780686] env[60044]: INFO nova.compute.manager [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1055.780961] env[60044]: DEBUG oslo.service.loopingcall [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1055.781214] env[60044]: DEBUG nova.compute.manager [-] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1055.781317] env[60044]: DEBUG nova.network.neutron [-] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1055.805295] env[60044]: DEBUG nova.network.neutron [-] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1055.813191] env[60044]: INFO nova.compute.manager [-] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Took 0.03 seconds to deallocate network for instance. 
[ 1055.892074] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1055.892074] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Creating directory with path [datastore2] vmware_temp/09b8d312-ebd9-4648-b1f9-096a28a07424/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1055.893438] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-186f08ea-33f8-441f-96f3-e0875fcb2aea {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.895746] env[60044]: DEBUG oslo_concurrency.lockutils [None req-eb30c12c-594e-480a-aeed-7a14cf528a9c tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "ef011071-c0e1-44e0-9940-285f2f45da67" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.159s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1055.906851] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Created directory with path [datastore2] vmware_temp/09b8d312-ebd9-4648-b1f9-096a28a07424/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1055.907095] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Fetch image to [datastore2] vmware_temp/09b8d312-ebd9-4648-b1f9-096a28a07424/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1055.907303] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/09b8d312-ebd9-4648-b1f9-096a28a07424/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1055.908091] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfedf0e1-5197-4abf-b0f2-1f3c3ef0af59 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.915165] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f615a67d-5a56-4998-9deb-d3318d6a9edb {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.925893] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-b673688e-6684-4a6f-a6e5-7186f5558367 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.958480] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9a84534-74fc-4959-933e-db80e2896b4d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.966692] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f420f289-9c7d-43b6-b522-bfa5bcb8d9bd {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.968283] env[60044]: DEBUG oslo_vmware.api [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Task: {'id': task-2204785, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07213} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1055.968516] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1055.968688] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1055.968855] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1055.969030] env[60044]: INFO nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1055.970948] env[60044]: DEBUG nova.compute.claims [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1055.971123] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1055.971326] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1055.987283] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1055.994502] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.023s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1055.995168] env[60044]: DEBUG nova.compute.utils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Instance 6874067b-8e9b-4242-9a5f-6312f1484a00 could not be found. {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1055.996506] env[60044]: DEBUG nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Instance disappeared during build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1055.996688] env[60044]: DEBUG nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1055.996852] env[60044]: DEBUG nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1055.997020] env[60044]: DEBUG nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1055.997180] env[60044]: DEBUG nova.network.neutron [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1056.021525] env[60044]: DEBUG neutronclient.v2_0.client [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60044) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1056.025175] env[60044]: ERROR nova.compute.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1056.025175] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Traceback (most recent call last): [ 1056.025175] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1056.025175] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1056.025175] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1056.025175] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] result = getattr(controller, method)(*args, **kwargs) [ 1056.025175] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1056.025175] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self._get(image_id) [ 1056.025175] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1056.025175] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1056.025175] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1056.025175] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] resp, body = self.http_client.get(url, headers=header) [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self.request(url, 'GET', **kwargs) [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self._handle_response(resp) [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise exc.from_response(resp, resp.content) [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] During handling of the above exception, another exception occurred: [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Traceback (most recent call last): [ 1056.025534] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self.driver.spawn(context, instance, image_meta, [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self._fetch_image_if_missing(context, vi) [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] image_fetch(context, vi, tmp_image_ds_loc) [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] images.fetch_image( [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 
6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] metadata = IMAGE_API.get(context, image_ref) [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1056.025884] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return session.show(context, image_id, [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] _reraise_translated_image_exception(image_id) [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise new_exc.with_traceback(exc_trace) [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] result = getattr(controller, method)(*args, **kwargs) [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self._get(image_id) [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1056.026242] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] resp, body = self.http_client.get(url, headers=header) [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self.request(url, 'GET', **kwargs) [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] 
return self._handle_response(resp) [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise exc.from_response(resp, resp.content) [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] During handling of the above exception, another exception occurred: [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Traceback (most recent call last): [ 1056.026552] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self._build_and_run_instance(context, instance, image, [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] with excutils.save_and_reraise_exception(): [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self.force_reraise() [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise self.value [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] with self.rt.instance_claim(context, instance, node, allocs, [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self.abort() [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1056.026908] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File 
"/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return f(*args, **kwargs) [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self._unset_instance_host_and_node(instance) [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] instance.save() [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] updates, result = self.indirection_api.object_action( [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return cctxt.call(context, 'object_action', objinst=objinst, [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1056.027277] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] result = self.transport._send( [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self._driver.send(target, ctxt, message, [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise result [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] nova.exception_Remote.InstanceNotFound_Remote: Instance 6874067b-8e9b-4242-9a5f-6312f1484a00 could not be found. 
[ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Traceback (most recent call last): [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return getattr(target, method)(*args, **kwargs) [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.027641] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return fn(self, *args, **kwargs) [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] old_ref, inst_ref = db.instance_update_and_get_original( [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return f(*args, **kwargs) [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] with excutils.save_and_reraise_exception() as ectxt: [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self.force_reraise() [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.027997] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise self.value [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return f(*args, 
**kwargs) [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return f(context, *args, **kwargs) [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise exception.InstanceNotFound(instance_id=uuid) [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.028422] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] nova.exception.InstanceNotFound: Instance 6874067b-8e9b-4242-9a5f-6312f1484a00 could not be found. [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] During handling of the above exception, another exception occurred: [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Traceback (most recent call last): [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] ret = obj(*args, **kwargs) [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] exception_handler_v20(status_code, error_body) [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise client_exc(message=error_message, [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1056.028881] 
env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Neutron server returns request_ids: ['req-bc31a095-cc3b-4d76-a67b-1d32fc5284fa'] [ 1056.028881] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] During handling of the above exception, another exception occurred: [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Traceback (most recent call last): [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self._deallocate_network(context, instance, requested_networks) [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self.network_api.deallocate_for_instance( [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] data = neutron.list_ports(**search_opts) [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] ret = obj(*args, **kwargs) [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1056.029306] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self.list('ports', self.ports_path, retrieve_all, [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] ret = obj(*args, **kwargs) [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] for r in self._pagination(collection, path, **params): [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] res = self.get(path, params=params) [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] ret = obj(*args, **kwargs) [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self.retry_request("GET", action, body=body, [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] ret = obj(*args, **kwargs) [ 1056.029727] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1056.030120] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] return self.do_request(method, action, body=body, [ 1056.030120] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1056.030120] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] ret = obj(*args, **kwargs) [ 1056.030120] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1056.030120] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] self._handle_fault_response(status_code, replybody, resp) [ 1056.030120] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1056.030120] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] raise exception.Unauthorized() [ 1056.030120] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] nova.exception.Unauthorized: Not authorized. 
[ 1056.030120] env[60044]: ERROR nova.compute.manager [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] [ 1056.049965] env[60044]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "6874067b-8e9b-4242-9a5f-6312f1484a00" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 459.686s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1056.087676] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1056.088518] env[60044]: ERROR nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1056.088518] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Traceback (most recent call last): [ 1056.088518] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1056.088518] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1056.088518] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1056.088518] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] result = getattr(controller, method)(*args, **kwargs) [ 1056.088518] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1056.088518] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self._get(image_id) [ 1056.088518] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1056.088518] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1056.088518] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1056.088807] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] resp, body = self.http_client.get(url, headers=header) [ 1056.088807] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1056.088807] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self.request(url, 'GET', **kwargs) [ 1056.088807]
env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1056.088807] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self._handle_response(resp) [ 1056.088807] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1056.088807] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise exc.from_response(resp, resp.content) [ 1056.088807] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1056.088807] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.088807] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] During handling of the above exception, another exception occurred: [ 1056.088807] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.088807] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Traceback (most recent call last): [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] yield resources [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self.driver.spawn(context, instance, image_meta, [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self._fetch_image_if_missing(context, vi) [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] image_fetch(context, vi, tmp_image_ds_loc) [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] images.fetch_image( [ 
1056.089109] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] metadata = IMAGE_API.get(context, image_ref) [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return session.show(context, image_id, [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] _reraise_translated_image_exception(image_id) [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise new_exc.with_traceback(exc_trace) [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] result = getattr(controller, method)(*args, **kwargs) [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1056.089436] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self._get(image_id) [ 1056.089789] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1056.089789] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1056.089789] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1056.089789] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] resp, body = self.http_client.get(url, headers=header) [ 1056.089789] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1056.089789] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self.request(url, 'GET', **kwargs) [ 1056.089789] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1056.089789] env[60044]: ERROR 
nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self._handle_response(resp) [ 1056.089789] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1056.089789] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise exc.from_response(resp, resp.content) [ 1056.089789] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1056.089789] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.090078] env[60044]: INFO nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Terminating instance [ 1056.090604] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1056.090892] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1056.091491] env[60044]: DEBUG nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1056.091717] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1056.091971] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-94b5eeda-2909-4f70-b570-1896eb5c958c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.095352] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eef1466b-5dfe-4fdb-a66c-3e964892e2c9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.102408] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1056.102655] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-062350f7-ca59-4f7d-ac8b-e82769669c3e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.104767] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1056.104974] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1056.106249] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-62aaab71-208a-4300-9be8-42129303bdba {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.110922] env[60044]: DEBUG oslo_vmware.api [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Waiting for the task: (returnval){ [ 1056.110922] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]528141f3-6819-5aef-9380-082ecfe37419" [ 1056.110922] env[60044]: _type = "Task" [ 1056.110922] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1056.118139] env[60044]: DEBUG oslo_vmware.api [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]528141f3-6819-5aef-9380-082ecfe37419, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1056.164028] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1056.164267] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1056.164444] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Deleting the datastore file [datastore2] f03f507b-364f-41b9-ad33-dcb56ab03317 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1056.164744] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-70d70e91-5611-4cbe-b2e0-d7e74144bb47 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.171657] env[60044]: DEBUG oslo_vmware.api [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Waiting for the task: (returnval){ [ 1056.171657] env[60044]: value = "task-2204787" [ 1056.171657] env[60044]: _type = "Task" [ 1056.171657] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1056.181067] env[60044]: DEBUG oslo_vmware.api [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Task: {'id': task-2204787, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1056.621040] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1056.621366] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Creating directory with path [datastore2] vmware_temp/fe119625-f9c4-47c2-9ca1-279b7e2b7f74/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1056.621629] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-92ec4e93-1a8e-4702-b515-c6296b7cca2c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.633195] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Created directory with path [datastore2] vmware_temp/fe119625-f9c4-47c2-9ca1-279b7e2b7f74/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1056.633417] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Fetch image to [datastore2] vmware_temp/fe119625-f9c4-47c2-9ca1-279b7e2b7f74/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1056.633622] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/fe119625-f9c4-47c2-9ca1-279b7e2b7f74/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1056.634367] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97f2dfc1-0b9b-4a8f-b051-0044e99fd4ab {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.640898] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55a7d83b-c6db-455a-83af-e3e2873a4041 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.649496] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff760499-63ed-4611-9af4-203ac800b65b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.683019] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bc980dd-9a6a-48b4-a035-0d5b7b2e09cc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.690766] 
env[60044]: DEBUG oslo_vmware.api [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Task: {'id': task-2204787, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074351} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1056.692245] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1056.692434] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1056.692603] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1056.692771] env[60044]: INFO nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1056.694569] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b0983d21-9595-432a-ae89-d5c996c852b3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.696637] env[60044]: DEBUG nova.compute.claims [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1056.696830] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1056.697056] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1056.720919] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 
1056.724737] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1056.725459] env[60044]: DEBUG nova.compute.utils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Instance f03f507b-364f-41b9-ad33-dcb56ab03317 could not be found. {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1056.727135] env[60044]: DEBUG nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Instance disappeared during build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1056.727305] env[60044]: DEBUG nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1056.727467] env[60044]: DEBUG nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1056.727710] env[60044]: DEBUG nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1056.728009] env[60044]: DEBUG nova.network.neutron [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1056.754147] env[60044]: DEBUG neutronclient.v2_0.client [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60044) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1056.755760] env[60044]: ERROR nova.compute.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1056.755760] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Traceback (most recent call last): [ 1056.755760] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1056.755760] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1056.755760] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1056.755760] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] result = getattr(controller, method)(*args, **kwargs) [ 1056.755760] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1056.755760] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self._get(image_id) [ 1056.755760] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1056.755760] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1056.755760] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1056.755760] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] resp, body = self.http_client.get(url, headers=header) [ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self.request(url, 'GET', **kwargs) [ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self._handle_response(resp) [ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise exc.from_response(resp, resp.content) [ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] During handling of the above exception, another exception occurred: [ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Traceback (most recent call last): [ 1056.756144] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self.driver.spawn(context, instance, image_meta, [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self._fetch_image_if_missing(context, vi) [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] image_fetch(context, vi, tmp_image_ds_loc) [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] images.fetch_image( [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] metadata = IMAGE_API.get(context, image_ref) [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1056.756469] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return session.show(context, image_id, [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] _reraise_translated_image_exception(image_id) [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise new_exc.with_traceback(exc_trace) [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: 
f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] result = getattr(controller, method)(*args, **kwargs) [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self._get(image_id) [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1056.756857] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] resp, body = self.http_client.get(url, headers=header) [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self.request(url, 'GET', **kwargs) [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self._handle_response(resp) [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise exc.from_response(resp, resp.content) [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. 
[ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] During handling of the above exception, another exception occurred: [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Traceback (most recent call last): [ 1056.757193] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self._build_and_run_instance(context, instance, image, [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] with excutils.save_and_reraise_exception(): [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self.force_reraise() [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise self.value [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] with self.rt.instance_claim(context, instance, node, allocs, [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self.abort() [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1056.757478] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1056.757810] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1056.757810] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return f(*args, **kwargs) [ 1056.757810] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1056.757810] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self._unset_instance_host_and_node(instance) [ 1056.757810] env[60044]: ERROR nova.compute.manager 
[instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1056.757810] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] instance.save() [ 1056.757810] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1056.757810] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] updates, result = self.indirection_api.object_action( [ 1056.757810] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1056.757810] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return cctxt.call(context, 'object_action', objinst=objinst, [ 1056.757810] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1056.757810] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] result = self.transport._send( [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self._driver.send(target, ctxt, message, [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise result [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] nova.exception_Remote.InstanceNotFound_Remote: Instance f03f507b-364f-41b9-ad33-dcb56ab03317 could not be found. 
[ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Traceback (most recent call last): [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return getattr(target, method)(*args, **kwargs) [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758160] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return fn(self, *args, **kwargs) [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] old_ref, inst_ref = db.instance_update_and_get_original( [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return f(*args, **kwargs) [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] with excutils.save_and_reraise_exception() as ectxt: [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self.force_reraise() [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758533] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise self.value [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return f(*args, 
**kwargs) [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return f(context, *args, **kwargs) [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise exception.InstanceNotFound(instance_id=uuid) [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.758941] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] nova.exception.InstanceNotFound: Instance f03f507b-364f-41b9-ad33-dcb56ab03317 could not be found. [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] During handling of the above exception, another exception occurred: [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Traceback (most recent call last): [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] ret = obj(*args, **kwargs) [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] exception_handler_v20(status_code, error_body) [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise client_exc(message=error_message, [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1056.759358] 
env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Neutron server returns request_ids: ['req-28a700d2-6eee-478e-9ee8-fca9a5a8dcdb'] [ 1056.759358] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] During handling of the above exception, another exception occurred: [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Traceback (most recent call last): [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self._deallocate_network(context, instance, requested_networks) [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self.network_api.deallocate_for_instance( [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] data = neutron.list_ports(**search_opts) [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] ret = obj(*args, **kwargs) [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1056.759799] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self.list('ports', self.ports_path, retrieve_all, [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] ret = obj(*args, **kwargs) [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] for r in self._pagination(collection, path, **params): [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] res = self.get(path, params=params) [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] ret = obj(*args, **kwargs) [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self.retry_request("GET", action, body=body, [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] ret = obj(*args, **kwargs) [ 1056.760145] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1056.760556] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] return self.do_request(method, action, body=body, [ 1056.760556] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1056.760556] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] ret = obj(*args, **kwargs) [ 1056.760556] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1056.760556] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] self._handle_fault_response(status_code, replybody, resp) [ 1056.760556] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1056.760556] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] raise exception.Unauthorized() [ 1056.760556] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] nova.exception.Unauthorized: Not authorized. 
[ 1056.760556] env[60044]: ERROR nova.compute.manager [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] [ 1056.778412] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "f03f507b-364f-41b9-ad33-dcb56ab03317" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 452.562s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1056.821358] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1056.822148] env[60044]: ERROR nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1056.822148] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Traceback (most recent call last): [ 1056.822148] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1056.822148] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1056.822148] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1056.822148] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] result = getattr(controller, method)(*args, **kwargs) [ 1056.822148] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1056.822148] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self._get(image_id) [ 1056.822148] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1056.822148] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1056.822148] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1056.822564] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] resp, body = self.http_client.get(url, headers=header) [ 1056.822564] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1056.822564] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self.request(url, 'GET', **kwargs) [ 1056.822564] env[60044]: ERROR 
nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1056.822564] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self._handle_response(resp) [ 1056.822564] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1056.822564] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise exc.from_response(resp, resp.content) [ 1056.822564] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1056.822564] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1056.822564] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] During handling of the above exception, another exception occurred: [ 1056.822564] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1056.822564] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Traceback (most recent call last): [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] yield resources [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self.driver.spawn(context, instance, image_meta, [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self._fetch_image_if_missing(context, vi) [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] image_fetch(context, vi, tmp_image_ds_loc) [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1056.822899] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] images.fetch_image( [ 1056.822899] env[60044]: 
ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] metadata = IMAGE_API.get(context, image_ref) [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return session.show(context, image_id, [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] _reraise_translated_image_exception(image_id) [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise new_exc.with_traceback(exc_trace) [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] result = getattr(controller, method)(*args, **kwargs) [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1056.823281] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self._get(image_id) [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] resp, body = self.http_client.get(url, headers=header) [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self.request(url, 'GET', **kwargs) [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: 
ea4a243b-481f-421d-ba29-c88c828f754e] return self._handle_response(resp) [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise exc.from_response(resp, resp.content) [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1056.823708] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1056.824051] env[60044]: INFO nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Terminating instance [ 1056.824094] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1056.824307] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1056.824936] env[60044]: DEBUG nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1056.825901] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1056.825901] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2258700b-d614-42e5-bda0-9827ae695042 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.828381] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acc0f711-0b75-46a3-bfab-994348df86a8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.835875] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1056.835946] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f4c79d33-983b-4f1b-b5d9-c596f77319a0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.838403] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1056.838576] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1056.839559] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9aa67eb7-a811-40db-ab75-1e486264269a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.846132] env[60044]: DEBUG oslo_vmware.api [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Waiting for the task: (returnval){ [ 1056.846132] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52e3f9d7-f03e-22ab-4ec5-01c12ab0132f" [ 1056.846132] env[60044]: _type = "Task" [ 1056.846132] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1056.855094] env[60044]: DEBUG oslo_vmware.api [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52e3f9d7-f03e-22ab-4ec5-01c12ab0132f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1056.895640] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1056.895640] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1056.895640] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Deleting the datastore file [datastore2] ea4a243b-481f-421d-ba29-c88c828f754e {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1056.896402] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b67d6200-b3ef-422f-a736-2c41c1f667aa {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.902282] env[60044]: DEBUG oslo_vmware.api [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Waiting for the task: (returnval){ [ 1056.902282] env[60044]: value = "task-2204789" [ 1056.902282] env[60044]: _type = "Task" [ 1056.902282] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1056.909830] env[60044]: DEBUG oslo_vmware.api [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Task: {'id': task-2204789, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1057.356054] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1057.356374] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Creating directory with path [datastore2] vmware_temp/b10c2581-bae3-4985-8b0d-15d2d91676ec/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1057.356517] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7afdff8d-045d-4699-b2db-d561b317b2f5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.367331] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Created directory with path [datastore2] vmware_temp/b10c2581-bae3-4985-8b0d-15d2d91676ec/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1057.367510] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Fetch image to [datastore2] vmware_temp/b10c2581-bae3-4985-8b0d-15d2d91676ec/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1057.367675] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/b10c2581-bae3-4985-8b0d-15d2d91676ec/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1057.368357] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65c26fd3-aec3-413c-ae5f-03af82585523 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.374745] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b16911b-a0e0-455a-a0f8-1abaddbe421e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.383558] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-448474a0-1f7b-4e65-a1af-81c19704734c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.416132] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b42f01a-9bb9-4513-ad6a-eae7d01f148d {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.422558] env[60044]: DEBUG oslo_vmware.api [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Task: {'id': task-2204789, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072078} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1057.423926] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1057.424124] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1057.424296] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1057.424463] env[60044]: INFO nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1057.426205] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-dfa6e0e7-5670-43bd-84ea-d3fc85984645 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.428050] env[60044]: DEBUG nova.compute.claims [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1057.428232] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1057.428468] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1057.450229] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1057.454483] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1057.455119] env[60044]: DEBUG nova.compute.utils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Instance ea4a243b-481f-421d-ba29-c88c828f754e could not be found. {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1057.456725] env[60044]: DEBUG nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Instance disappeared during build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1057.456876] env[60044]: DEBUG nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1057.457047] env[60044]: DEBUG nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1057.457310] env[60044]: DEBUG nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1057.457357] env[60044]: DEBUG nova.network.neutron [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1057.481671] env[60044]: DEBUG neutronclient.v2_0.client [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60044) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1057.483165] env[60044]: ERROR nova.compute.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1057.483165] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Traceback (most recent call last): [ 1057.483165] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1057.483165] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1057.483165] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1057.483165] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] result = getattr(controller, method)(*args, **kwargs) [ 1057.483165] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1057.483165] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self._get(image_id) [ 1057.483165] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1057.483165] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1057.483165] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1057.483165] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] resp, body = self.http_client.get(url, headers=header) [ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self.request(url, 'GET', **kwargs) [ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self._handle_response(resp) [ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise exc.from_response(resp, resp.content) [ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] During handling of the above exception, another exception occurred: [ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Traceback (most recent call last): [ 1057.483542] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self.driver.spawn(context, instance, image_meta, [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self._fetch_image_if_missing(context, vi) [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] image_fetch(context, vi, tmp_image_ds_loc) [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] images.fetch_image( [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] metadata = IMAGE_API.get(context, image_ref) [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1057.483869] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return session.show(context, image_id, [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] _reraise_translated_image_exception(image_id) [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise new_exc.with_traceback(exc_trace) [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: 
ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] result = getattr(controller, method)(*args, **kwargs) [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self._get(image_id) [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1057.484236] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] resp, body = self.http_client.get(url, headers=header) [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self.request(url, 'GET', **kwargs) [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self._handle_response(resp) [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise exc.from_response(resp, resp.content) [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. 
[ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] During handling of the above exception, another exception occurred: [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Traceback (most recent call last): [ 1057.484587] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self._build_and_run_instance(context, instance, image, [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] with excutils.save_and_reraise_exception(): [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self.force_reraise() [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise self.value [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] with self.rt.instance_claim(context, instance, node, allocs, [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self.abort() [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1057.484932] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1057.485307] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1057.485307] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return f(*args, **kwargs) [ 1057.485307] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1057.485307] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self._unset_instance_host_and_node(instance) [ 1057.485307] env[60044]: ERROR nova.compute.manager 
[instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1057.485307] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] instance.save() [ 1057.485307] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1057.485307] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] updates, result = self.indirection_api.object_action( [ 1057.485307] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1057.485307] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return cctxt.call(context, 'object_action', objinst=objinst, [ 1057.485307] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1057.485307] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] result = self.transport._send( [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self._driver.send(target, ctxt, message, [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise result [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] nova.exception_Remote.InstanceNotFound_Remote: Instance ea4a243b-481f-421d-ba29-c88c828f754e could not be found. 
[ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Traceback (most recent call last): [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return getattr(target, method)(*args, **kwargs) [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.485673] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return fn(self, *args, **kwargs) [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] old_ref, inst_ref = db.instance_update_and_get_original( [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return f(*args, **kwargs) [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] with excutils.save_and_reraise_exception() as ectxt: [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self.force_reraise() [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486081] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise self.value [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return f(*args, 
**kwargs) [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return f(context, *args, **kwargs) [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise exception.InstanceNotFound(instance_id=uuid) [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486487] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] nova.exception.InstanceNotFound: Instance ea4a243b-481f-421d-ba29-c88c828f754e could not be found. [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] During handling of the above exception, another exception occurred: [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Traceback (most recent call last): [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] ret = obj(*args, **kwargs) [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] exception_handler_v20(status_code, error_body) [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise client_exc(message=error_message, [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1057.486933] 
env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Neutron server returns request_ids: ['req-bdc48ada-df2a-4049-b968-1f674e1773ba'] [ 1057.486933] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] During handling of the above exception, another exception occurred: [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Traceback (most recent call last): [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self._deallocate_network(context, instance, requested_networks) [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self.network_api.deallocate_for_instance( [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] data = neutron.list_ports(**search_opts) [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] ret = obj(*args, **kwargs) [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1057.487334] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self.list('ports', self.ports_path, retrieve_all, [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] ret = obj(*args, **kwargs) [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] for r in self._pagination(collection, path, **params): [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] res = self.get(path, params=params) [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] ret = obj(*args, **kwargs) [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self.retry_request("GET", action, body=body, [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] ret = obj(*args, **kwargs) [ 1057.487715] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1057.488082] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] return self.do_request(method, action, body=body, [ 1057.488082] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1057.488082] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] ret = obj(*args, **kwargs) [ 1057.488082] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1057.488082] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] self._handle_fault_response(status_code, replybody, resp) [ 1057.488082] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1057.488082] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] raise exception.Unauthorized() [ 1057.488082] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] nova.exception.Unauthorized: Not authorized. 
[ 1057.488082] env[60044]: ERROR nova.compute.manager [instance: ea4a243b-481f-421d-ba29-c88c828f754e] [ 1057.505424] env[60044]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "ea4a243b-481f-421d-ba29-c88c828f754e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 451.856s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1057.545389] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1057.546229] env[60044]: ERROR nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1057.546229] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Traceback (most recent call last): [ 1057.546229] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1057.546229] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1057.546229] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1057.546229] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] result = getattr(controller, method)(*args, **kwargs) [ 1057.546229] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1057.546229] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self._get(image_id) [ 1057.546229] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1057.546229] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1057.546229] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1057.546558] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] resp, body = self.http_client.get(url, headers=header) [ 1057.546558] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1057.546558] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self.request(url, 'GET', **kwargs) [ 1057.546558] 
env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1057.546558] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self._handle_response(resp) [ 1057.546558] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1057.546558] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise exc.from_response(resp, resp.content) [ 1057.546558] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1057.546558] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1057.546558] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] During handling of the above exception, another exception occurred: [ 1057.546558] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1057.546558] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Traceback (most recent call last): [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] yield resources [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self.driver.spawn(context, instance, image_meta, [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self._fetch_image_if_missing(context, vi) [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] image_fetch(context, vi, tmp_image_ds_loc) [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] images.fetch_image( [ 
1057.546890] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] metadata = IMAGE_API.get(context, image_ref) [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return session.show(context, image_id, [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] _reraise_translated_image_exception(image_id) [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise new_exc.with_traceback(exc_trace) [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] result = getattr(controller, method)(*args, **kwargs) [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1057.547264] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self._get(image_id) [ 1057.547659] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1057.547659] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1057.547659] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1057.547659] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] resp, body = self.http_client.get(url, headers=header) [ 1057.547659] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1057.547659] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self.request(url, 'GET', **kwargs) [ 1057.547659] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1057.547659] env[60044]: ERROR 
nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self._handle_response(resp) [ 1057.547659] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1057.547659] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise exc.from_response(resp, resp.content) [ 1057.547659] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1057.547659] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1057.547983] env[60044]: INFO nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Terminating instance [ 1057.548071] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1057.548284] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1057.548877] env[60044]: DEBUG nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1057.549071] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1057.549295] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6db968ac-8e48-438b-9b13-252fe59cea8d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.554264] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-050b7a9d-ba1b-468b-a8cd-9356bd328a4e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.561106] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1057.561521] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6eb89d9a-8338-4d83-b0b9-2b4ba89157b3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.563972] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1057.564157] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1057.565083] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f36d30a0-7265-4aa2-bcc3-8c3f38d05ebc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.569771] env[60044]: DEBUG oslo_vmware.api [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Waiting for the task: (returnval){ [ 1057.569771] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]525cbe9f-fe96-38ac-18fb-8b8946caa8cc" [ 1057.569771] env[60044]: _type = "Task" [ 1057.569771] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1057.577397] env[60044]: DEBUG oslo_vmware.api [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]525cbe9f-fe96-38ac-18fb-8b8946caa8cc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1057.629544] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1057.629745] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1057.629966] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Deleting the datastore file [datastore2] df997589-61b6-4f68-9169-e6f9bee650c7 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1057.630232] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-809410f7-bef7-45f8-972d-d389f76b4c0f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.636788] env[60044]: DEBUG oslo_vmware.api [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Waiting for the task: (returnval){ [ 1057.636788] env[60044]: value = "task-2204791" [ 1057.636788] env[60044]: _type = "Task" [ 1057.636788] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1057.644624] env[60044]: DEBUG oslo_vmware.api [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Task: {'id': task-2204791, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1058.079684] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1058.079952] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Creating directory with path [datastore2] vmware_temp/d6952405-c23f-449a-aefb-6e948846352b/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1058.080191] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c5c5c335-54da-467c-8fd8-7bfdd225f821 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.091402] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Created directory with path [datastore2] vmware_temp/d6952405-c23f-449a-aefb-6e948846352b/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1058.091586] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Fetch image to [datastore2] vmware_temp/d6952405-c23f-449a-aefb-6e948846352b/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1058.091752] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/d6952405-c23f-449a-aefb-6e948846352b/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1058.092487] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8db8a78-e427-4d81-8340-d066d2655417 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.098984] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4235a4c9-5032-40f1-a1b7-62fa733dad31 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.107698] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e239f19b-e5ad-4c55-96eb-1db3c5a35146 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.137462] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fbbee98-1052-46ab-9dcc-e51d2d319e64 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.148795] 
env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-61802725-b67b-4065-8c79-34556ff74d24 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.150398] env[60044]: DEBUG oslo_vmware.api [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Task: {'id': task-2204791, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073514} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1058.150626] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1058.150798] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1058.150960] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1058.151142] env[60044]: INFO nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1058.153231] env[60044]: DEBUG nova.compute.claims [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1058.153418] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1058.153642] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1058.171720] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1058.178415] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1058.179066] env[60044]: DEBUG nova.compute.utils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Instance df997589-61b6-4f68-9169-e6f9bee650c7 could not be found. {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1058.180390] env[60044]: DEBUG nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Instance disappeared during build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1058.180555] env[60044]: DEBUG nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1058.180711] env[60044]: DEBUG nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1058.180876] env[60044]: DEBUG nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1058.181045] env[60044]: DEBUG nova.network.neutron [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1058.206577] env[60044]: DEBUG neutronclient.v2_0.client [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60044) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1058.208126] env[60044]: ERROR nova.compute.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1058.208126] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Traceback (most recent call last): [ 1058.208126] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1058.208126] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1058.208126] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1058.208126] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] result = getattr(controller, method)(*args, **kwargs) [ 1058.208126] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1058.208126] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self._get(image_id) [ 1058.208126] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1058.208126] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1058.208126] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1058.208126] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] resp, body = self.http_client.get(url, headers=header) [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self.request(url, 'GET', **kwargs) [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self._handle_response(resp) [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise exc.from_response(resp, resp.content) [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] During handling of the above exception, another exception occurred: [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Traceback (most recent call last): [ 1058.208481] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self.driver.spawn(context, instance, image_meta, [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self._fetch_image_if_missing(context, vi) [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] image_fetch(context, vi, tmp_image_ds_loc) [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] images.fetch_image( [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: 
df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] metadata = IMAGE_API.get(context, image_ref) [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1058.208847] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return session.show(context, image_id, [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] _reraise_translated_image_exception(image_id) [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise new_exc.with_traceback(exc_trace) [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] result = getattr(controller, method)(*args, **kwargs) [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self._get(image_id) [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1058.209205] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] resp, body = self.http_client.get(url, headers=header) [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self.request(url, 'GET', **kwargs) [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] 
return self._handle_response(resp) [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise exc.from_response(resp, resp.content) [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] During handling of the above exception, another exception occurred: [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Traceback (most recent call last): [ 1058.209556] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self._build_and_run_instance(context, instance, image, [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] with excutils.save_and_reraise_exception(): [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self.force_reraise() [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise self.value [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] with self.rt.instance_claim(context, instance, node, allocs, [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self.abort() [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1058.209891] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File 
"/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return f(*args, **kwargs) [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self._unset_instance_host_and_node(instance) [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] instance.save() [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] updates, result = self.indirection_api.object_action( [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return cctxt.call(context, 'object_action', objinst=objinst, [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1058.210269] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] result = self.transport._send( [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self._driver.send(target, ctxt, message, [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise result [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] nova.exception_Remote.InstanceNotFound_Remote: Instance df997589-61b6-4f68-9169-e6f9bee650c7 could not be found. 
[ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Traceback (most recent call last): [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return getattr(target, method)(*args, **kwargs) [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.210600] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return fn(self, *args, **kwargs) [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] old_ref, inst_ref = db.instance_update_and_get_original( [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return f(*args, **kwargs) [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] with excutils.save_and_reraise_exception() as ectxt: [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self.force_reraise() [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.210956] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise self.value [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return f(*args, 
**kwargs) [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return f(context, *args, **kwargs) [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise exception.InstanceNotFound(instance_id=uuid) [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.211375] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] nova.exception.InstanceNotFound: Instance df997589-61b6-4f68-9169-e6f9bee650c7 could not be found. [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] During handling of the above exception, another exception occurred: [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Traceback (most recent call last): [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] ret = obj(*args, **kwargs) [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] exception_handler_v20(status_code, error_body) [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise client_exc(message=error_message, [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1058.211780] 
env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Neutron server returns request_ids: ['req-e9612d85-1231-445a-b290-583758f53252'] [ 1058.211780] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] During handling of the above exception, another exception occurred: [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Traceback (most recent call last): [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self._deallocate_network(context, instance, requested_networks) [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self.network_api.deallocate_for_instance( [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] data = neutron.list_ports(**search_opts) [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] ret = obj(*args, **kwargs) [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1058.212173] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self.list('ports', self.ports_path, retrieve_all, [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] ret = obj(*args, **kwargs) [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] for r in self._pagination(collection, path, **params): [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] res = self.get(path, params=params) [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] ret = obj(*args, **kwargs) [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self.retry_request("GET", action, body=body, [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] ret = obj(*args, **kwargs) [ 1058.212515] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1058.212863] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] return self.do_request(method, action, body=body, [ 1058.212863] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1058.212863] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] ret = obj(*args, **kwargs) [ 1058.212863] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1058.212863] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] self._handle_fault_response(status_code, replybody, resp) [ 1058.212863] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1058.212863] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] raise exception.Unauthorized() [ 1058.212863] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] nova.exception.Unauthorized: Not authorized. 
[ 1058.212863] env[60044]: ERROR nova.compute.manager [instance: df997589-61b6-4f68-9169-e6f9bee650c7] [ 1058.229929] env[60044]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "df997589-61b6-4f68-9169-e6f9bee650c7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 452.509s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1058.268784] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1058.269543] env[60044]: ERROR nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1058.269543] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Traceback (most recent call last): [ 1058.269543] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1058.269543] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1058.269543] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1058.269543] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] result = getattr(controller, method)(*args, **kwargs) [ 1058.269543] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1058.269543] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self._get(image_id) [ 1058.269543] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1058.269543] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1058.269543] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1058.269910] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] resp, body = self.http_client.get(url, headers=header) [ 1058.269910] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1058.269910] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self.request(url, 'GET', **kwargs) [ 1058.269910] env[60044]: ERROR 
nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1058.269910] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self._handle_response(resp) [ 1058.269910] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1058.269910] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise exc.from_response(resp, resp.content) [ 1058.269910] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1058.269910] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1058.269910] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] During handling of the above exception, another exception occurred: [ 1058.269910] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1058.269910] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Traceback (most recent call last): [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] yield resources [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self.driver.spawn(context, instance, image_meta, [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self._fetch_image_if_missing(context, vi) [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] image_fetch(context, vi, tmp_image_ds_loc) [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1058.270259] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] images.fetch_image( [ 1058.270259] env[60044]: 
ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] metadata = IMAGE_API.get(context, image_ref) [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return session.show(context, image_id, [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] _reraise_translated_image_exception(image_id) [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise new_exc.with_traceback(exc_trace) [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] result = getattr(controller, method)(*args, **kwargs) [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1058.270634] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self._get(image_id) [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] resp, body = self.http_client.get(url, headers=header) [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self.request(url, 'GET', **kwargs) [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 
0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self._handle_response(resp) [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise exc.from_response(resp, resp.content) [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1058.271034] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1058.271366] env[60044]: INFO nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Terminating instance [ 1058.271366] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1058.271762] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1058.271762] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e5e541cb-06b7-4e52-9aea-b0393934d32f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.274570] env[60044]: DEBUG nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1058.274758] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1058.275520] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3164b6b-70d2-446c-af51-7508a78fe2c0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.282790] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1058.282980] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-53d96c8d-3b4e-496a-b1fc-33f8fafccb07 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.285159] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1058.285329] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1058.286298] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-404d0807-c9fa-41ca-a347-845426648cea {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.291271] env[60044]: DEBUG oslo_vmware.api [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Waiting for the task: (returnval){ [ 1058.291271] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]529bb2ef-69a7-706f-8afb-a07999b18f25" [ 1058.291271] env[60044]: _type = "Task" [ 1058.291271] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1058.298025] env[60044]: DEBUG oslo_vmware.api [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]529bb2ef-69a7-706f-8afb-a07999b18f25, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1058.341137] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1058.341371] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1058.341548] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Deleting the datastore file [datastore2] 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1058.341818] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9b8d2863-29d3-4d0d-8af5-66696249d98a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.348369] env[60044]: DEBUG oslo_vmware.api [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Waiting for the task: (returnval){ [ 1058.348369] env[60044]: value = "task-2204793" [ 1058.348369] env[60044]: _type = "Task" [ 1058.348369] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1058.356258] env[60044]: DEBUG oslo_vmware.api [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Task: {'id': task-2204793, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1058.800545] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1058.800787] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Creating directory with path [datastore2] vmware_temp/b674f116-7b4f-4cf2-8406-5513642308c7/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1058.801036] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1352cc2c-5bc6-4310-849c-be928737186c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.811529] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Created directory with path [datastore2] vmware_temp/b674f116-7b4f-4cf2-8406-5513642308c7/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1058.811710] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Fetch image to [datastore2] vmware_temp/b674f116-7b4f-4cf2-8406-5513642308c7/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1058.811868] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/b674f116-7b4f-4cf2-8406-5513642308c7/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1058.812583] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d9be354-214e-44d2-acec-281cd4630dfa {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.818891] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae1e345e-0260-427b-95a2-d352678d5c34 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.828077] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2ce79fb-61b5-4506-ac4a-3bffca6b733a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.861275] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbccaf05-5af6-491f-b2f8-88b6fc5ab0f2 {{(pid=60044) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.867828] env[60044]: DEBUG oslo_vmware.api [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Task: {'id': task-2204793, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070104} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1058.869180] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1058.869371] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1058.869539] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1058.869707] env[60044]: INFO nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Took 0.59 seconds to destroy the instance on the hypervisor. [ 1058.871427] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4ce708d6-b0e0-4097-8c99-bac16a2d7bd5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.873221] env[60044]: DEBUG nova.compute.claims [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1058.873387] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1058.873605] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1058.897263] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1058.899939] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1058.900535] env[60044]: DEBUG nova.compute.utils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Instance 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09 could not be found. {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1058.901885] env[60044]: DEBUG nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Instance disappeared during build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1058.902086] env[60044]: DEBUG nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1058.902254] env[60044]: DEBUG nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1058.902419] env[60044]: DEBUG nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1058.902575] env[60044]: DEBUG nova.network.neutron [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1058.990181] env[60044]: DEBUG oslo_vmware.rw_handles [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b674f116-7b4f-4cf2-8406-5513642308c7/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1059.045662] env[60044]: DEBUG neutronclient.v2_0.client [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60044) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1059.047236] env[60044]: ERROR nova.compute.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1059.047236] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Traceback (most recent call last): [ 1059.047236] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1059.047236] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1059.047236] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1059.047236] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] result = getattr(controller, method)(*args, **kwargs) [ 1059.047236] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1059.047236] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self._get(image_id) [ 1059.047236] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1059.047236] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1059.047236] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1059.047236] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] resp, body = self.http_client.get(url, headers=header) [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self.request(url, 'GET', **kwargs) [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self._handle_response(resp) [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in 
_handle_response [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise exc.from_response(resp, resp.content) [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] During handling of the above exception, another exception occurred: [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Traceback (most recent call last): [ 1059.047635] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self.driver.spawn(context, instance, image_meta, [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self._fetch_image_if_missing(context, vi) [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] image_fetch(context, vi, tmp_image_ds_loc) [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] images.fetch_image( [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] metadata = IMAGE_API.get(context, image_ref) [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1059.047932] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return session.show(context, image_id, [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1059.048268] 
env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] _reraise_translated_image_exception(image_id) [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise new_exc.with_traceback(exc_trace) [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] result = getattr(controller, method)(*args, **kwargs) [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self._get(image_id) [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1059.048268] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] resp, body = self.http_client.get(url, headers=header) [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self.request(url, 'GET', **kwargs) [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self._handle_response(resp) [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise exc.from_response(resp, resp.content) [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. 
[ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] During handling of the above exception, another exception occurred: [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Traceback (most recent call last): [ 1059.048563] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self._build_and_run_instance(context, instance, image, [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] with excutils.save_and_reraise_exception(): [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self.force_reraise() [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise self.value [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] with self.rt.instance_claim(context, instance, node, allocs, [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self.abort() [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1059.048854] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1059.049179] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1059.049179] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return f(*args, **kwargs) [ 1059.049179] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1059.049179] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self._unset_instance_host_and_node(instance) [ 1059.049179] env[60044]: ERROR nova.compute.manager 
[instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1059.049179] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] instance.save() [ 1059.049179] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1059.049179] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] updates, result = self.indirection_api.object_action( [ 1059.049179] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1059.049179] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return cctxt.call(context, 'object_action', objinst=objinst, [ 1059.049179] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1059.049179] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] result = self.transport._send( [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self._driver.send(target, ctxt, message, [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise result [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] nova.exception_Remote.InstanceNotFound_Remote: Instance 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09 could not be found. 
[ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Traceback (most recent call last): [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return getattr(target, method)(*args, **kwargs) [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.049450] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return fn(self, *args, **kwargs) [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] old_ref, inst_ref = db.instance_update_and_get_original( [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return f(*args, **kwargs) [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] with excutils.save_and_reraise_exception() as ectxt: [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self.force_reraise() [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.049754] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise self.value [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return f(*args, 
**kwargs) [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return f(context, *args, **kwargs) [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise exception.InstanceNotFound(instance_id=uuid) [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.050127] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] nova.exception.InstanceNotFound: Instance 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09 could not be found. [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] During handling of the above exception, another exception occurred: [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Traceback (most recent call last): [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] ret = obj(*args, **kwargs) [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] exception_handler_v20(status_code, error_body) [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise client_exc(message=error_message, [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1059.050463] 
env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Neutron server returns request_ids: ['req-34ed1e59-746f-4b07-bb12-7c955001b94f'] [ 1059.050463] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] During handling of the above exception, another exception occurred: [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Traceback (most recent call last): [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self._deallocate_network(context, instance, requested_networks) [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self.network_api.deallocate_for_instance( [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] data = neutron.list_ports(**search_opts) [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] ret = obj(*args, **kwargs) [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1059.050823] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self.list('ports', self.ports_path, retrieve_all, [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] ret = obj(*args, **kwargs) [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] for r in self._pagination(collection, path, **params): [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] res = self.get(path, params=params) [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] ret = obj(*args, **kwargs) [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self.retry_request("GET", action, body=body, [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] ret = obj(*args, **kwargs) [ 1059.051938] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1059.052503] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] return self.do_request(method, action, body=body, [ 1059.052503] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1059.052503] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] ret = obj(*args, **kwargs) [ 1059.052503] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1059.052503] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] self._handle_fault_response(status_code, replybody, resp) [ 1059.052503] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1059.052503] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] raise exception.Unauthorized() [ 1059.052503] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] nova.exception.Unauthorized: Not authorized. [ 1059.052503] env[60044]: ERROR nova.compute.manager [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] [ 1059.054081] env[60044]: DEBUG oslo_vmware.rw_handles [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1059.054285] env[60044]: DEBUG oslo_vmware.rw_handles [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b674f116-7b4f-4cf2-8406-5513642308c7/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1059.072605] env[60044]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "0e99d8ab-6b62-4ea9-b7c9-06394fa93e09" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 431.458s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1059.692238] env[60044]: DEBUG nova.compute.manager [req-1ed0d375-18ab-490a-924a-487c04c1d615 req-f85ff20e-cceb-4164-8c9c-d67a385265c7 service nova] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Received event network-vif-deleted-4cbfe223-af4a-4e63-a600-d6e0ee204ee8 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1068.019567] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1068.019816] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Cleaning up deleted instances with incomplete migration {{(pid=60044) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 1070.026976] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1073.014135] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1073.025082] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1073.025247] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1074.019022] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1074.019330] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1074.019426] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1074.029081] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1074.029287] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1074.029447] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1074.029640] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1074.030706] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-573307f6-79c9-49af-8bdf-76f43fd78dcc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.042248] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f1793b0-c64f-46c1-bbf9-98c15749ef05 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.055894] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b3ac37a-a4d7-4d01-b380-8ac2d9dfe042 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.062739] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b44127c3-81bf-4f22-9bb0-6a43e6aacc19 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.092573] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181204MB 
free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1074.093204] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1074.093204] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1074.191480] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1074.191659] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1074.206543] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Refreshing inventories for resource provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1074.221578] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Updating ProviderTree inventory for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1074.221730] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Updating inventory in ProviderTree for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1074.233682] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Refreshing aggregate associations for resource provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca, aggregates: None {{(pid=60044) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1074.250382] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None 
None] Refreshing trait associations for resource provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE {{(pid=60044) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1074.262965] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1936870a-d35a-44ff-a78c-6834b988ec28 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.270547] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33ed0198-ff62-4ef2-aaf2-95fc280a840a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.299332] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10815800-4dda-40fb-a52e-adf8a7c33536 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.308317] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7423df12-8711-4fa1-ae0d-a01784462608 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.320272] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1074.329636] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1074.347075] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1074.347281] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.254s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1074.562310] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "4754c01f-d312-4b2a-af5a-a34c5bcb42eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1074.562569] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 
tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "4754c01f-d312-4b2a-af5a-a34c5bcb42eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1074.571771] env[60044]: DEBUG nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Starting instance... {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1074.619318] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1074.619566] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1074.620994] env[60044]: INFO nova.compute.claims [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1074.691334] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e8fd36e-0b00-4af4-b2cf-dce22cdb7895 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.698346] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb38ff59-0c0a-4034-916f-815c58c3ba64 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.727243] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae3b1f80-40c5-4a8a-9103-8b58fb17e504 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.734331] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21336b77-c1b0-451e-81d2-0795a7fc9b45 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.748164] env[60044]: DEBUG nova.compute.provider_tree [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1074.757074] env[60044]: DEBUG nova.scheduler.client.report [None req-e3b90cce-9493-4211-a984-92014de3d208 
tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1074.771011] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1074.771553] env[60044]: DEBUG nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1074.803946] env[60044]: DEBUG nova.compute.utils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1074.806041] env[60044]: DEBUG nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1074.806041] env[60044]: DEBUG nova.network.neutron [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1074.813993] env[60044]: DEBUG nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Start building block device mappings for instance. 
{{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1074.861774] env[60044]: DEBUG nova.policy [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd9589808fcce4507bd6988b2a5119ff9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '583d131351dd4ef6a6db6ffd061a6a1e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 1074.880370] env[60044]: DEBUG nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Start spawning the instance on the hypervisor. {{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1074.901764] env[60044]: DEBUG nova.virt.hardware [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1074.901997] env[60044]: DEBUG nova.virt.hardware [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1074.902169] env[60044]: DEBUG nova.virt.hardware [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1074.902351] env[60044]: DEBUG nova.virt.hardware [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1074.902511] env[60044]: DEBUG nova.virt.hardware [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Image pref 0:0:0 
{{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1074.902657] env[60044]: DEBUG nova.virt.hardware [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1074.902857] env[60044]: DEBUG nova.virt.hardware [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1074.903029] env[60044]: DEBUG nova.virt.hardware [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1074.903198] env[60044]: DEBUG nova.virt.hardware [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1074.904242] env[60044]: DEBUG nova.virt.hardware [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1074.904242] env[60044]: DEBUG nova.virt.hardware [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1074.904602] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e664ac0-a35d-47fc-ba6b-31bf8afe2eeb {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1074.912501] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de4d56df-ed42-427a-8f26-16d095fb623c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1075.019174] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1075.019473] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1075.019561] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None 
None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1075.125221] env[60044]: DEBUG nova.network.neutron [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Successfully created port: 42a9cf23-8c9d-4fe6-94f5-aa582254cd76 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1075.645329] env[60044]: DEBUG nova.compute.manager [req-111d408c-177d-4113-a753-75d8867cae4f req-17e25ae1-3862-4967-b5c4-3d8e6d4c701e service nova] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Received event network-vif-plugged-42a9cf23-8c9d-4fe6-94f5-aa582254cd76 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1075.645534] env[60044]: DEBUG oslo_concurrency.lockutils [req-111d408c-177d-4113-a753-75d8867cae4f req-17e25ae1-3862-4967-b5c4-3d8e6d4c701e service nova] Acquiring lock "4754c01f-d312-4b2a-af5a-a34c5bcb42eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1075.645738] env[60044]: DEBUG oslo_concurrency.lockutils [req-111d408c-177d-4113-a753-75d8867cae4f req-17e25ae1-3862-4967-b5c4-3d8e6d4c701e service nova] Lock "4754c01f-d312-4b2a-af5a-a34c5bcb42eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1075.645899] env[60044]: DEBUG oslo_concurrency.lockutils [req-111d408c-177d-4113-a753-75d8867cae4f req-17e25ae1-3862-4967-b5c4-3d8e6d4c701e service nova] Lock "4754c01f-d312-4b2a-af5a-a34c5bcb42eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1075.647497] env[60044]: DEBUG nova.compute.manager [req-111d408c-177d-4113-a753-75d8867cae4f req-17e25ae1-3862-4967-b5c4-3d8e6d4c701e service nova] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] No waiting events found dispatching network-vif-plugged-42a9cf23-8c9d-4fe6-94f5-aa582254cd76 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1075.647694] env[60044]: WARNING nova.compute.manager [req-111d408c-177d-4113-a753-75d8867cae4f req-17e25ae1-3862-4967-b5c4-3d8e6d4c701e service nova] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Received unexpected event network-vif-plugged-42a9cf23-8c9d-4fe6-94f5-aa582254cd76 for instance with vm_state building and task_state spawning. 
[ 1075.718470] env[60044]: DEBUG nova.network.neutron [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Successfully updated port: 42a9cf23-8c9d-4fe6-94f5-aa582254cd76 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1075.729252] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "refresh_cache-4754c01f-d312-4b2a-af5a-a34c5bcb42eb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1075.729389] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquired lock "refresh_cache-4754c01f-d312-4b2a-af5a-a34c5bcb42eb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1075.729533] env[60044]: DEBUG nova.network.neutron [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1075.763036] env[60044]: DEBUG nova.network.neutron [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Instance cache missing network info. 
{{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1075.921897] env[60044]: DEBUG nova.network.neutron [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Updating instance_info_cache with network_info: [{"id": "42a9cf23-8c9d-4fe6-94f5-aa582254cd76", "address": "fa:16:3e:2b:b0:b7", "network": {"id": "9f0b877f-48a6-4869-854a-8fb8510a6c82", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1126431951-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "583d131351dd4ef6a6db6ffd061a6a1e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap42a9cf23-8c", "ovs_interfaceid": "42a9cf23-8c9d-4fe6-94f5-aa582254cd76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1075.932802] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Releasing lock "refresh_cache-4754c01f-d312-4b2a-af5a-a34c5bcb42eb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1075.933101] env[60044]: DEBUG nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Instance network_info: |[{"id": "42a9cf23-8c9d-4fe6-94f5-aa582254cd76", "address": "fa:16:3e:2b:b0:b7", "network": {"id": "9f0b877f-48a6-4869-854a-8fb8510a6c82", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1126431951-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "583d131351dd4ef6a6db6ffd061a6a1e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap42a9cf23-8c", "ovs_interfaceid": "42a9cf23-8c9d-4fe6-94f5-aa582254cd76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}]| {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1075.933489] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2b:b0:b7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '24210a23-d8ac-4f4f-84ac-dc0636de9a72', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '42a9cf23-8c9d-4fe6-94f5-aa582254cd76', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1075.941113] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Creating folder: Project (583d131351dd4ef6a6db6ffd061a6a1e). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1075.941593] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fc2e9385-e682-4ad0-bf72-bebd207db77d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1075.952738] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Created folder: Project (583d131351dd4ef6a6db6ffd061a6a1e) in parent group-v449562. [ 1075.952931] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Creating folder: Instances. Parent ref: group-v449623. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1075.953189] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d6856472-c98b-4659-bf40-99d6c9b9831a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1075.961970] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Created folder: Instances in parent group-v449623. [ 1075.962198] env[60044]: DEBUG oslo.service.loopingcall [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1075.962363] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1075.962540] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-29a7de51-0d3b-4591-899c-6cf571a35ae3 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1075.980723] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1075.980723] env[60044]: value = "task-2204796" [ 1075.980723] env[60044]: _type = "Task" [ 1075.980723] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1075.987675] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204796, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1076.024690] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1076.025063] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Cleaning up deleted instances {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 1076.055782] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] There are 9 instances to clean {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 1076.055942] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Instance has had 0 of 5 cleanup attempts {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1076.078646] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Instance has had 0 of 5 cleanup attempts {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1076.099014] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Instance has had 0 of 5 cleanup attempts {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1076.121599] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Instance has had 0 of 5 cleanup attempts {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1076.140819] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Instance has had 0 of 5 cleanup attempts {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1076.160785] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Instance has had 0 of 5 cleanup attempts {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1076.180693] env[60044]: 
DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Instance has had 0 of 5 cleanup attempts {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1076.201995] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Instance has had 0 of 5 cleanup attempts {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1076.224075] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Instance has had 0 of 5 cleanup attempts {{(pid=60044) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1076.491257] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204796, 'name': CreateVM_Task, 'duration_secs': 0.311942} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1076.491419] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1076.492090] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1076.492257] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1076.492573] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1076.492809] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2d6f88a6-7145-4fff-88c5-4a311678f59a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.497038] env[60044]: DEBUG oslo_vmware.api [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Waiting for the task: (returnval){ [ 1076.497038] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52bae13b-e3f0-1dd4-69b9-25bbda68835c" [ 1076.497038] env[60044]: _type = "Task" [ 1076.497038] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1076.504631] env[60044]: DEBUG oslo_vmware.api [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52bae13b-e3f0-1dd4-69b9-25bbda68835c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1077.007881] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1077.008144] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1077.008457] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1077.691532] env[60044]: DEBUG nova.compute.manager [req-4c61a01a-1564-40f5-822c-e06f4760506e req-c21c7e09-496a-4957-8a33-8ddba7d7f49b service nova] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Received event network-changed-42a9cf23-8c9d-4fe6-94f5-aa582254cd76 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1077.691755] env[60044]: DEBUG nova.compute.manager [req-4c61a01a-1564-40f5-822c-e06f4760506e req-c21c7e09-496a-4957-8a33-8ddba7d7f49b service nova] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Refreshing instance network info cache due to event network-changed-42a9cf23-8c9d-4fe6-94f5-aa582254cd76. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1077.691908] env[60044]: DEBUG oslo_concurrency.lockutils [req-4c61a01a-1564-40f5-822c-e06f4760506e req-c21c7e09-496a-4957-8a33-8ddba7d7f49b service nova] Acquiring lock "refresh_cache-4754c01f-d312-4b2a-af5a-a34c5bcb42eb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1077.692064] env[60044]: DEBUG oslo_concurrency.lockutils [req-4c61a01a-1564-40f5-822c-e06f4760506e req-c21c7e09-496a-4957-8a33-8ddba7d7f49b service nova] Acquired lock "refresh_cache-4754c01f-d312-4b2a-af5a-a34c5bcb42eb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1077.692229] env[60044]: DEBUG nova.network.neutron [req-4c61a01a-1564-40f5-822c-e06f4760506e req-c21c7e09-496a-4957-8a33-8ddba7d7f49b service nova] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Refreshing network info cache for port 42a9cf23-8c9d-4fe6-94f5-aa582254cd76 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1077.914553] env[60044]: DEBUG nova.network.neutron [req-4c61a01a-1564-40f5-822c-e06f4760506e req-c21c7e09-496a-4957-8a33-8ddba7d7f49b service nova] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Updated VIF entry in instance network info cache for port 42a9cf23-8c9d-4fe6-94f5-aa582254cd76. {{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1077.914985] env[60044]: DEBUG nova.network.neutron [req-4c61a01a-1564-40f5-822c-e06f4760506e req-c21c7e09-496a-4957-8a33-8ddba7d7f49b service nova] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Updating instance_info_cache with network_info: [{"id": "42a9cf23-8c9d-4fe6-94f5-aa582254cd76", "address": "fa:16:3e:2b:b0:b7", "network": {"id": "9f0b877f-48a6-4869-854a-8fb8510a6c82", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1126431951-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "583d131351dd4ef6a6db6ffd061a6a1e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap42a9cf23-8c", "ovs_interfaceid": "42a9cf23-8c9d-4fe6-94f5-aa582254cd76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1077.923597] env[60044]: DEBUG oslo_concurrency.lockutils [req-4c61a01a-1564-40f5-822c-e06f4760506e req-c21c7e09-496a-4957-8a33-8ddba7d7f49b service nova] Releasing lock "refresh_cache-4754c01f-d312-4b2a-af5a-a34c5bcb42eb" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1078.238449] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1078.238645] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1078.238731] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1078.248411] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1078.248555] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1079.018687] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1104.556666] env[60044]: WARNING oslo_vmware.rw_handles [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1104.556666] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1104.556666] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1104.556666] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1104.556666] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1104.556666] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 1104.556666] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1104.556666] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1104.556666] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1104.556666] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1104.556666] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1104.556666] env[60044]: ERROR oslo_vmware.rw_handles [ 1104.557512] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/b674f116-7b4f-4cf2-8406-5513642308c7/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1104.559123] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None 
req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1104.559388] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Copying Virtual Disk [datastore2] vmware_temp/b674f116-7b4f-4cf2-8406-5513642308c7/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/b674f116-7b4f-4cf2-8406-5513642308c7/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1104.559704] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c3fa5bb2-631a-45d7-b821-e9ac1937eb50 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1104.567989] env[60044]: DEBUG oslo_vmware.api [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Waiting for the task: (returnval){ [ 1104.567989] env[60044]: value = "task-2204797" [ 1104.567989] env[60044]: _type = "Task" [ 1104.567989] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1104.575409] env[60044]: DEBUG oslo_vmware.api [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Task: {'id': task-2204797, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1105.078364] env[60044]: DEBUG oslo_vmware.exceptions [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Fault InvalidArgument not matched. 
{{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1105.078617] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1105.079189] env[60044]: ERROR nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1105.079189] env[60044]: Faults: ['InvalidArgument'] [ 1105.079189] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] Traceback (most recent call last): [ 1105.079189] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1105.079189] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] yield resources [ 1105.079189] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1105.079189] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] self.driver.spawn(context, instance, image_meta, [ 1105.079189] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1105.079189] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1105.079189] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1105.079189] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] self._fetch_image_if_missing(context, vi) [ 1105.079189] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] image_cache(vi, tmp_image_ds_loc) [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] vm_util.copy_virtual_disk( [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] session._wait_for_task(vmdk_copy_task) [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] return self.wait_for_task(task_ref) [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] return evt.wait() [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] result = hub.switch() [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1105.079557] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] return self.greenlet.switch() [ 1105.079887] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1105.079887] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] self.f(*self.args, **self.kw) [ 1105.079887] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1105.079887] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] raise exceptions.translate_fault(task_info.error) [ 1105.079887] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1105.079887] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] Faults: ['InvalidArgument'] [ 1105.079887] env[60044]: ERROR nova.compute.manager [instance: e84f3fe9-d377-4018-8874-972d1f888208] [ 1105.079887] env[60044]: INFO nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Terminating instance [ 1105.081127] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1105.081339] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1105.081559] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b90eaf62-3faf-4102-b468-7421bf7bb4ee {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.084693] 
env[60044]: DEBUG nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1105.084926] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1105.085633] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb99352f-b068-4251-9b50-e9c50de2eab4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.092445] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1105.092665] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6b6c4220-cd22-4699-a18a-6598c2be5bf1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.094827] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1105.095089] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1105.095997] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1022a40d-86c5-41b3-a389-7c0deca8daf0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.100897] env[60044]: DEBUG oslo_vmware.api [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Waiting for the task: (returnval){ [ 1105.100897] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52a26ced-8bf1-a613-8fd6-c406d445e77d" [ 1105.100897] env[60044]: _type = "Task" [ 1105.100897] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1105.107944] env[60044]: DEBUG oslo_vmware.api [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52a26ced-8bf1-a613-8fd6-c406d445e77d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1105.163136] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1105.163355] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1105.163556] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Deleting the datastore file [datastore2] e84f3fe9-d377-4018-8874-972d1f888208 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1105.163828] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0ba1f982-2c18-43e4-9ffd-dd61cd8bc2b4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.170695] env[60044]: DEBUG oslo_vmware.api [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Waiting for the task: (returnval){ [ 1105.170695] env[60044]: value = "task-2204799" [ 1105.170695] env[60044]: _type = "Task" [ 1105.170695] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1105.181662] env[60044]: DEBUG oslo_vmware.api [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Task: {'id': task-2204799, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1105.202061] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._sync_power_states {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1105.214083] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Getting list of instances from cluster (obj){ [ 1105.214083] env[60044]: value = "domain-c8" [ 1105.214083] env[60044]: _type = "ClusterComputeResource" [ 1105.214083] env[60044]: } {{(pid=60044) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1105.215089] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-818e9501-a675-4328-b8b7-8d01f9c6a45b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.228844] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Got total of 4 instances {{(pid=60044) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1105.229011] env[60044]: WARNING nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] While synchronizing instance power states, found 1 instances in the database and 4 instances on the hypervisor. [ 1105.229180] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Triggering sync for uuid 4754c01f-d312-4b2a-af5a-a34c5bcb42eb {{(pid=60044) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 1105.229500] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "4754c01f-d312-4b2a-af5a-a34c5bcb42eb" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1105.611206] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1105.611560] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Creating directory with path [datastore2] vmware_temp/d6f92e93-59f3-4109-acd9-9ff4a53afd72/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1105.611676] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b98b4795-6282-4d1d-8812-d48989265601 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.622590] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Created directory with path [datastore2] vmware_temp/d6f92e93-59f3-4109-acd9-9ff4a53afd72/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1105.622772] env[60044]: DEBUG 
nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Fetch image to [datastore2] vmware_temp/d6f92e93-59f3-4109-acd9-9ff4a53afd72/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1105.622920] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/d6f92e93-59f3-4109-acd9-9ff4a53afd72/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1105.623625] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14c0803c-fe25-4426-8a32-8ddf3a5ddfe7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.630025] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71cd42ec-bf9c-4443-83a5-bb4020653765 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.638707] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f3f1825-f075-4dd1-ae18-a6c1b44ca727 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.669198] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbe6a318-d01c-4c87-a61a-56ece9ff22a1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.679309] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c3237b6f-49bd-4c7c-8f8d-9bd092860dfa {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.680886] env[60044]: DEBUG oslo_vmware.api [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Task: {'id': task-2204799, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06824} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1105.681124] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1105.681302] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1105.681460] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1105.681631] env[60044]: INFO nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1105.684023] env[60044]: DEBUG nova.compute.claims [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1105.684195] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1105.684401] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1105.704811] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1105.709761] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1105.710508] env[60044]: DEBUG 
nova.compute.utils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Instance e84f3fe9-d377-4018-8874-972d1f888208 could not be found. {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1105.711941] env[60044]: DEBUG nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Instance disappeared during build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1105.712122] env[60044]: DEBUG nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1105.712292] env[60044]: DEBUG nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1105.712447] env[60044]: DEBUG nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1105.712612] env[60044]: DEBUG nova.network.neutron [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1105.739261] env[60044]: DEBUG nova.network.neutron [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1105.748066] env[60044]: INFO nova.compute.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Took 0.04 seconds to deallocate network for instance. [ 1105.750930] env[60044]: DEBUG oslo_vmware.rw_handles [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d6f92e93-59f3-4109-acd9-9ff4a53afd72/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1105.809258] env[60044]: DEBUG oslo_vmware.rw_handles [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1105.809433] env[60044]: DEBUG oslo_vmware.rw_handles [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d6f92e93-59f3-4109-acd9-9ff4a53afd72/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1105.824790] env[60044]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e84f3fe9-d377-4018-8874-972d1f888208" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 390.075s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1131.046803] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1135.018611] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1135.018925] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1135.018964] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1135.028850] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1135.029064] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1135.029227] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1135.029378] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1135.030432] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98af908c-0ad2-4a28-9f4d-7a1c6681b14b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1135.039760] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8018a0d8-167e-4778-bc6e-45131cc372f4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1135.053404] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3d15fb4-a06a-4ba5-b2a8-62e1d7dfedb5 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1135.059421] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-602dbdb1-2329-47c0-8343-22ccedf1a548 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1135.088109] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181199MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1135.088243] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1135.088424] env[60044]: DEBUG oslo_concurrency.lockutils [None 
req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1135.123372] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 4754c01f-d312-4b2a-af5a-a34c5bcb42eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1135.123561] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1135.123699] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1135.149756] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b516d98-7b5a-4228-abbb-d4ed3e0563e6 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1135.156430] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc0e02b2-c01e-4159-86f5-c97271d71189 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1135.185491] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d691e0eb-bdac-4028-9ef6-97e552eeca30 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1135.192027] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5c6009d-9db2-471f-bc66-21526c4ad6f0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1135.204297] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1135.212243] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1135.226444] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for 
cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1135.226614] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1136.226488] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1136.226755] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1137.014625] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1137.018273] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1139.019681] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1139.020068] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1139.020068] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1139.029915] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Skipping network cache update for instance because it is Building. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1139.030096] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. 
{{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1140.019207] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1151.238622] env[60044]: WARNING oslo_vmware.rw_handles [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1151.238622] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1151.238622] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1151.238622] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1151.238622] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1151.238622] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 1151.238622] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1151.238622] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1151.238622] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1151.238622] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1151.238622] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1151.238622] env[60044]: ERROR oslo_vmware.rw_handles [ 1151.239309] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/d6f92e93-59f3-4109-acd9-9ff4a53afd72/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1151.241132] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1151.241411] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Copying Virtual Disk [datastore2] vmware_temp/d6f92e93-59f3-4109-acd9-9ff4a53afd72/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/d6f92e93-59f3-4109-acd9-9ff4a53afd72/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1151.241754] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-713df99f-3cd1-408b-b05c-c78dfd7dd9b1 {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1151.250422] env[60044]: DEBUG oslo_vmware.api [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Waiting for the task: (returnval){ [ 1151.250422] env[60044]: value = "task-2204800" [ 1151.250422] env[60044]: _type = "Task" [ 1151.250422] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1151.257850] env[60044]: DEBUG oslo_vmware.api [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Task: {'id': task-2204800, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1151.760805] env[60044]: DEBUG oslo_vmware.exceptions [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Fault InvalidArgument not matched. {{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1151.761053] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1151.761623] env[60044]: ERROR nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1151.761623] env[60044]: Faults: ['InvalidArgument'] [ 1151.761623] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Traceback (most recent call last): [ 1151.761623] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1151.761623] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] yield resources [ 1151.761623] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1151.761623] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self.driver.spawn(context, instance, image_meta, [ 1151.761623] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1151.761623] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1151.761623] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1151.761623] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self._fetch_image_if_missing(context, vi) [ 1151.761623] env[60044]: ERROR nova.compute.manager 
[instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] image_cache(vi, tmp_image_ds_loc) [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] vm_util.copy_virtual_disk( [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] session._wait_for_task(vmdk_copy_task) [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return self.wait_for_task(task_ref) [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return evt.wait() [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] result = hub.switch() [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1151.762128] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return self.greenlet.switch() [ 1151.762499] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1151.762499] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self.f(*self.args, **self.kw) [ 1151.762499] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1151.762499] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] raise exceptions.translate_fault(task_info.error) [ 1151.762499] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1151.762499] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Faults: ['InvalidArgument'] [ 1151.762499] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1151.762499] env[60044]: INFO nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 
0d87148b-1493-4777-a8b3-b94a64e8eca6] Terminating instance [ 1151.763461] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1151.763660] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1151.763880] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0b3c960f-d281-4885-8c0b-decd7ca43b1e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1151.766151] env[60044]: DEBUG nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1151.766341] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1151.767028] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5814ef39-0dbd-4ca6-94b7-26952fd7c2f6 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1151.773504] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1151.773726] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-055db676-3866-46bc-8c04-ad83e692388b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1151.775799] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1151.776028] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1151.776981] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-077ad724-ac72-4194-bc23-2c4802b4a29c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1151.781641] env[60044]: DEBUG oslo_vmware.api [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Waiting for the task: (returnval){ [ 1151.781641] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]5266d081-7fdf-0f75-ac1c-b802de579f1e" [ 1151.781641] env[60044]: _type = "Task" [ 1151.781641] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1151.788470] env[60044]: DEBUG oslo_vmware.api [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]5266d081-7fdf-0f75-ac1c-b802de579f1e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1151.843010] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1151.843245] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1151.843421] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Deleting the datastore file [datastore2] 0d87148b-1493-4777-a8b3-b94a64e8eca6 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1151.843673] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-032b7b91-2597-4121-b4b3-23fdf806ded4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1151.849937] env[60044]: DEBUG oslo_vmware.api [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Waiting for the task: (returnval){ [ 1151.849937] env[60044]: value = "task-2204802" [ 1151.849937] env[60044]: _type = "Task" [ 1151.849937] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1151.857446] env[60044]: DEBUG oslo_vmware.api [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Task: {'id': task-2204802, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1152.291937] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1152.292384] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Creating directory with path [datastore2] vmware_temp/ebe64f98-bedb-4827-b824-d1139d4be95e/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1152.292427] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05a6112c-d7ec-4469-9c3e-de58a4f545b4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.304176] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Created directory with path [datastore2] vmware_temp/ebe64f98-bedb-4827-b824-d1139d4be95e/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1152.304431] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Fetch image to [datastore2] vmware_temp/ebe64f98-bedb-4827-b824-d1139d4be95e/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1152.304663] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/ebe64f98-bedb-4827-b824-d1139d4be95e/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1152.305680] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-068f2f24-d220-4b22-9a77-67b441c131cd {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.314093] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-437ebfd6-54d8-4be3-8a04-0bfdc7523b4e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.326265] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e28c413-dd01-4c3b-a099-b7fbb3d0cb19 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.366420] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b374b31-ca07-475e-bdc7-d28869066637 {{(pid=60044) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.373190] env[60044]: DEBUG oslo_vmware.api [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Task: {'id': task-2204802, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07155} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1152.374733] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1152.374918] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1152.375020] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1152.375199] env[60044]: INFO nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1152.377226] env[60044]: DEBUG nova.compute.claims [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1152.377394] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1152.377605] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1152.380051] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e5599b44-b6bf-4f75-a4e4-4c65e623c37e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.400868] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1152.406584] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.029s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1152.407264] env[60044]: DEBUG nova.compute.utils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Instance 0d87148b-1493-4777-a8b3-b94a64e8eca6 could not be found. {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1152.408675] env[60044]: DEBUG nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Instance disappeared during build. 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1152.408840] env[60044]: DEBUG nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1152.408994] env[60044]: DEBUG nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1152.409155] env[60044]: DEBUG nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1152.409308] env[60044]: DEBUG nova.network.neutron [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1152.450451] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1152.451232] env[60044]: ERROR nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. 
[ 1152.451232] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Traceback (most recent call last): [ 1152.451232] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1152.451232] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1152.451232] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1152.451232] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] result = getattr(controller, method)(*args, **kwargs) [ 1152.451232] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1152.451232] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self._get(image_id) [ 1152.451232] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1152.451232] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1152.451232] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] resp, body = self.http_client.get(url, headers=header) [ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self.request(url, 'GET', **kwargs) [ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self._handle_response(resp) [ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise exc.from_response(resp, resp.content) [ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] During handling of the above exception, another exception occurred: [ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1152.451708] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Traceback (most recent call last): [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] yield resources [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self.driver.spawn(context, instance, image_meta, [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self._fetch_image_if_missing(context, vi) [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] image_fetch(context, vi, tmp_image_ds_loc) [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] images.fetch_image( [ 1152.452334] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] metadata = IMAGE_API.get(context, image_ref) [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return session.show(context, image_id, [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] _reraise_translated_image_exception(image_id) [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise new_exc.with_traceback(exc_trace) [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] result = getattr(controller, method)(*args, **kwargs) [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1152.452867] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self._get(image_id) [ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] resp, body = self.http_client.get(url, headers=header) [ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self.request(url, 'GET', **kwargs) [ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self._handle_response(resp) [ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise exc.from_response(resp, resp.content) [ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. 
[ 1152.453238] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1152.453578] env[60044]: INFO nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Terminating instance [ 1152.453578] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1152.453578] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1152.453754] env[60044]: DEBUG nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1152.453937] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1152.454169] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2c1f4604-f228-47a0-b2fb-bbfd5ff2bcaf {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.456619] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5284e338-5740-492a-a8f6-5679b2bf4239 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.464746] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1152.465726] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a83ba2cf-8536-44cd-84cb-14a9f117d991 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.467099] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1152.467306] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 
tempest-AttachInterfacesTestJSON-1338008180-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1152.467953] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1850ee65-248a-437e-82e9-9e0a89fd5ad0 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.473506] env[60044]: DEBUG oslo_vmware.api [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Waiting for the task: (returnval){ [ 1152.473506] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]524eaafc-9f72-26ae-1342-fdc7567f3ef0" [ 1152.473506] env[60044]: _type = "Task" [ 1152.473506] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1152.481287] env[60044]: DEBUG oslo_vmware.api [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]524eaafc-9f72-26ae-1342-fdc7567f3ef0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1152.504941] env[60044]: DEBUG neutronclient.v2_0.client [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60044) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1152.506419] env[60044]: ERROR nova.compute.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1152.506419] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Traceback (most recent call last): [ 1152.506419] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1152.506419] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self.driver.spawn(context, instance, image_meta, [ 1152.506419] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1152.506419] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1152.506419] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1152.506419] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self._fetch_image_if_missing(context, vi) [ 1152.506419] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1152.506419] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] image_cache(vi, tmp_image_ds_loc) [ 1152.506419] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1152.506419] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] vm_util.copy_virtual_disk( [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] session._wait_for_task(vmdk_copy_task) [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return self.wait_for_task(task_ref) [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return evt.wait() [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] result = hub.switch() [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return self.greenlet.switch() [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self.f(*self.args, **self.kw) [ 1152.506773] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] raise exceptions.translate_fault(task_info.error) [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Faults: ['InvalidArgument'] [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] During handling of the above exception, another exception occurred: [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Traceback (most recent call last): [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self._build_and_run_instance(context, instance, image, [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] with excutils.save_and_reraise_exception(): [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self.force_reraise() [ 1152.507152] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] raise self.value [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] with self.rt.instance_claim(context, instance, node, allocs, [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self.abort() [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 
1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return f(*args, **kwargs) [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self._unset_instance_host_and_node(instance) [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1152.507565] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] instance.save() [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] updates, result = self.indirection_api.object_action( [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return cctxt.call(context, 'object_action', objinst=objinst, [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] result = self.transport._send( [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return self._driver.send(target, ctxt, message, [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1152.507935] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] raise result [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] nova.exception_Remote.InstanceNotFound_Remote: Instance 0d87148b-1493-4777-a8b3-b94a64e8eca6 could not be found. 
[ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Traceback (most recent call last): [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return getattr(target, method)(*args, **kwargs) [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return fn(self, *args, **kwargs) [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] old_ref, inst_ref = db.instance_update_and_get_original( [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return f(*args, **kwargs) [ 1152.509340] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] with excutils.save_and_reraise_exception() as ectxt: [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self.force_reraise() [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] raise self.value [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return f(*args, 
**kwargs) [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return f(context, *args, **kwargs) [ 1152.509702] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] raise exception.InstanceNotFound(instance_id=uuid) [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] nova.exception.InstanceNotFound: Instance 0d87148b-1493-4777-a8b3-b94a64e8eca6 could not be found. [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] During handling of the above exception, another exception occurred: [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Traceback (most recent call last): [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] ret = obj(*args, **kwargs) [ 1152.510074] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] exception_handler_v20(status_code, error_body) [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] raise client_exc(message=error_message, [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1152.510457] 
env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Neutron server returns request_ids: ['req-c49f8ef2-c3c3-4acb-ab2b-d5eb0b21263a'] [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] During handling of the above exception, another exception occurred: [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Traceback (most recent call last): [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self._deallocate_network(context, instance, requested_networks) [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1152.510457] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self.network_api.deallocate_for_instance( [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] data = neutron.list_ports(**search_opts) [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] ret = obj(*args, **kwargs) [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return self.list('ports', self.ports_path, retrieve_all, [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] ret = obj(*args, **kwargs) [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] for r in self._pagination(collection, path, **params): [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] res = self.get(path, params=params) [ 1152.510771] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] ret = obj(*args, **kwargs) [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return self.retry_request("GET", action, body=body, [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] ret = obj(*args, **kwargs) [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] return self.do_request(method, action, body=body, [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] ret = obj(*args, **kwargs) [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] self._handle_fault_response(status_code, replybody, resp) [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1152.511103] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] raise exception.Unauthorized() [ 1152.511423] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] nova.exception.Unauthorized: Not authorized. 
[ 1152.511423] env[60044]: ERROR nova.compute.manager [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] [ 1152.529477] env[60044]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "0d87148b-1493-4777-a8b3-b94a64e8eca6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 436.755s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1152.872555] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1152.872737] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1152.872912] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Deleting the datastore file [datastore2] f3566a4b-8fe0-4c85-9c45-7c67cfd30323 {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1152.873199] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-46b93c39-f6b3-472c-8525-e7974084cb3d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.879788] env[60044]: DEBUG oslo_vmware.api [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Waiting for the task: (returnval){ [ 1152.879788] env[60044]: value = "task-2204804" [ 1152.879788] env[60044]: _type = "Task" [ 1152.879788] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1152.886661] env[60044]: DEBUG oslo_vmware.api [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Task: {'id': task-2204804, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1152.982772] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1152.983123] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Creating directory with path [datastore2] vmware_temp/6ecf4fbc-636d-4fcc-8cb2-ab1f384f2c96/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1152.983350] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d42f69d2-a0fa-40e0-9885-a98e5737436d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1152.993811] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Created directory with path [datastore2] vmware_temp/6ecf4fbc-636d-4fcc-8cb2-ab1f384f2c96/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1152.993996] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Fetch image to [datastore2] vmware_temp/6ecf4fbc-636d-4fcc-8cb2-ab1f384f2c96/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1152.994176] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/6ecf4fbc-636d-4fcc-8cb2-ab1f384f2c96/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1152.994862] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21157837-7629-49b1-a571-37150fa9e6e8 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.001176] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e72694f-9f07-4469-bfff-d038b408ead4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.009683] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51d673f7-27b7-4f6e-b260-ae20e5ad9641 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.040271] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-081381e9-e01d-4d05-b3d1-727b4610ef88 {{(pid=60044) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.045293] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-150bf126-d08e-4304-8b0e-78eaf979b486 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.067044] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1153.162807] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1153.163629] env[60044]: ERROR nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1153.163629] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Traceback (most recent call last): [ 1153.163629] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1153.163629] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1153.163629] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1153.163629] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] result = getattr(controller, method)(*args, **kwargs) [ 1153.163629] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1153.163629] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self._get(image_id) [ 1153.163629] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1153.163629] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1153.163629] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] resp, body = self.http_client.get(url, headers=header) [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self.request(url, 'GET', **kwargs) [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self._handle_response(resp) [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise exc.from_response(resp, resp.content) [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] During handling of the above exception, another exception occurred: [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.163993] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Traceback (most recent call last): [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] yield resources [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self.driver.spawn(context, instance, image_meta, [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self._fetch_image_if_missing(context, vi) [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] image_fetch(context, vi, tmp_image_ds_loc) [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: 
c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] images.fetch_image( [ 1153.164425] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] metadata = IMAGE_API.get(context, image_ref) [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return session.show(context, image_id, [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] _reraise_translated_image_exception(image_id) [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise new_exc.with_traceback(exc_trace) [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] result = getattr(controller, method)(*args, **kwargs) [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1153.164863] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self._get(image_id) [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] resp, body = self.http_client.get(url, headers=header) [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self.request(url, 'GET', 
**kwargs) [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self._handle_response(resp) [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise exc.from_response(resp, resp.content) [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. [ 1153.165305] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.165621] env[60044]: INFO nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Terminating instance [ 1153.165621] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1153.165699] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1153.166337] env[60044]: DEBUG nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1153.166523] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1153.166750] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-085296cd-f46d-4dda-b2dc-dafb9d563128 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.169553] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d17540e7-0ed5-4967-85d6-c119023026dd {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.176190] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1153.176393] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5b7397fa-a2c4-474a-a80b-f65601f59a62 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.178498] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1153.178669] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1153.179683] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-551addb6-8dca-4935-98a9-4c3e89ce8919 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.184594] env[60044]: DEBUG oslo_vmware.api [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Waiting for the task: (returnval){ [ 1153.184594] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52a6a3be-267e-3e70-4049-9a05af1d5cf4" [ 1153.184594] env[60044]: _type = "Task" [ 1153.184594] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1153.191352] env[60044]: DEBUG oslo_vmware.api [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52a6a3be-267e-3e70-4049-9a05af1d5cf4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1153.246242] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1153.246469] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1153.246686] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Deleting the datastore file [datastore2] c0f7ff03-5203-418d-aa9e-420448e9dbfb {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1153.246960] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9048d848-c2fd-4acc-b4ac-e69db4168b6d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.252467] env[60044]: DEBUG oslo_vmware.api [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Waiting for the task: (returnval){ [ 1153.252467] env[60044]: value = "task-2204806" [ 1153.252467] env[60044]: _type = "Task" [ 1153.252467] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1153.260016] env[60044]: DEBUG oslo_vmware.api [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Task: {'id': task-2204806, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1153.388927] env[60044]: DEBUG oslo_vmware.api [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Task: {'id': task-2204804, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074768} completed successfully. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1153.389282] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1153.389339] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1153.389533] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1153.389703] env[60044]: INFO nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Took 0.94 seconds to destroy the instance on the hypervisor. [ 1153.391733] env[60044]: DEBUG nova.compute.claims [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1153.391899] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1153.392122] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1153.416682] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1153.417381] env[60044]: DEBUG nova.compute.utils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Instance f3566a4b-8fe0-4c85-9c45-7c67cfd30323 could not be found. 
{{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1153.418684] env[60044]: DEBUG nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Instance disappeared during build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1153.418847] env[60044]: DEBUG nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1153.419010] env[60044]: DEBUG nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1153.419184] env[60044]: DEBUG nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1153.419342] env[60044]: DEBUG nova.network.neutron [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1153.510908] env[60044]: DEBUG neutronclient.v2_0.client [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60044) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1153.512448] env[60044]: ERROR nova.compute.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1153.512448] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Traceback (most recent call last): [ 1153.512448] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1153.512448] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1153.512448] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1153.512448] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] result = getattr(controller, method)(*args, **kwargs) [ 1153.512448] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1153.512448] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self._get(image_id) [ 1153.512448] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1153.512448] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1153.512448] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1153.512448] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] resp, body = self.http_client.get(url, headers=header) [ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self.request(url, 'GET', **kwargs) [ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self._handle_response(resp) [ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise exc.from_response(resp, resp.content) [ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] During handling of the above exception, another exception occurred: [ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Traceback (most recent call last): [ 1153.512823] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self.driver.spawn(context, instance, image_meta, [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self._fetch_image_if_missing(context, vi) [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] image_fetch(context, vi, tmp_image_ds_loc) [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] images.fetch_image( [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] metadata = IMAGE_API.get(context, image_ref) [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1153.513175] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return session.show(context, image_id, [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] _reraise_translated_image_exception(image_id) [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise new_exc.with_traceback(exc_trace) [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: 
f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] result = getattr(controller, method)(*args, **kwargs) [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self._get(image_id) [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1153.513536] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] resp, body = self.http_client.get(url, headers=header) [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self.request(url, 'GET', **kwargs) [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self._handle_response(resp) [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise exc.from_response(resp, resp.content) [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. 
[ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] During handling of the above exception, another exception occurred: [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Traceback (most recent call last): [ 1153.513880] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self._build_and_run_instance(context, instance, image, [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] with excutils.save_and_reraise_exception(): [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self.force_reraise() [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise self.value [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] with self.rt.instance_claim(context, instance, node, allocs, [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self.abort() [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1153.514245] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1153.514688] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1153.514688] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return f(*args, **kwargs) [ 1153.514688] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1153.514688] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self._unset_instance_host_and_node(instance) [ 1153.514688] env[60044]: ERROR nova.compute.manager 
[instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1153.514688] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] instance.save() [ 1153.514688] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1153.514688] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] updates, result = self.indirection_api.object_action( [ 1153.514688] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1153.514688] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return cctxt.call(context, 'object_action', objinst=objinst, [ 1153.514688] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1153.514688] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] result = self.transport._send( [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self._driver.send(target, ctxt, message, [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise result [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] nova.exception_Remote.InstanceNotFound_Remote: Instance f3566a4b-8fe0-4c85-9c45-7c67cfd30323 could not be found. 
[ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Traceback (most recent call last): [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return getattr(target, method)(*args, **kwargs) [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515052] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return fn(self, *args, **kwargs) [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] old_ref, inst_ref = db.instance_update_and_get_original( [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return f(*args, **kwargs) [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] with excutils.save_and_reraise_exception() as ectxt: [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self.force_reraise() [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515432] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise self.value [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return f(*args, 
**kwargs) [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return f(context, *args, **kwargs) [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise exception.InstanceNotFound(instance_id=uuid) [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.515954] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] nova.exception.InstanceNotFound: Instance f3566a4b-8fe0-4c85-9c45-7c67cfd30323 could not be found. [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] During handling of the above exception, another exception occurred: [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Traceback (most recent call last): [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] ret = obj(*args, **kwargs) [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] exception_handler_v20(status_code, error_body) [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise client_exc(message=error_message, [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1153.516404] 
env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Neutron server returns request_ids: ['req-c6adfb18-b0ea-4c22-a6c7-8cfe5704be38'] [ 1153.516404] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] During handling of the above exception, another exception occurred: [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Traceback (most recent call last): [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self._deallocate_network(context, instance, requested_networks) [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self.network_api.deallocate_for_instance( [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] data = neutron.list_ports(**search_opts) [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] ret = obj(*args, **kwargs) [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1153.516920] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self.list('ports', self.ports_path, retrieve_all, [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] ret = obj(*args, **kwargs) [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] for r in self._pagination(collection, path, **params): [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] res = self.get(path, params=params) [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] ret = obj(*args, **kwargs) [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self.retry_request("GET", action, body=body, [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] ret = obj(*args, **kwargs) [ 1153.517344] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1153.517727] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] return self.do_request(method, action, body=body, [ 1153.517727] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1153.517727] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] ret = obj(*args, **kwargs) [ 1153.517727] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1153.517727] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] self._handle_fault_response(status_code, replybody, resp) [ 1153.517727] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1153.517727] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] raise exception.Unauthorized() [ 1153.517727] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] nova.exception.Unauthorized: Not authorized. 
[ 1153.517727] env[60044]: ERROR nova.compute.manager [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] [ 1153.535959] env[60044]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "f3566a4b-8fe0-4c85-9c45-7c67cfd30323" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 317.433s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1153.694202] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1153.694411] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Creating directory with path [datastore2] vmware_temp/c0828bbf-dd45-4a1c-8fac-072c8d736d58/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1153.694628] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e9cf005b-0a57-48ce-9351-8dd293baf80c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.705763] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Created directory with path [datastore2] vmware_temp/c0828bbf-dd45-4a1c-8fac-072c8d736d58/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1153.705949] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Fetch image to [datastore2] vmware_temp/c0828bbf-dd45-4a1c-8fac-072c8d736d58/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1153.706133] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/c0828bbf-dd45-4a1c-8fac-072c8d736d58/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1153.706836] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a92f41ab-d6a6-42a5-8855-9071c48f526d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.713286] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-171fb715-3151-4715-946b-1cb7c9a997c1 
{{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.721821] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3231ae7-9827-4a5d-9c98-0b9d22ffe5e9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.751829] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff6b30e7-56b6-4e0b-a9df-a9ceccc64549 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.761852] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-78099ac5-0569-407a-989d-0f230e635495 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.763430] env[60044]: DEBUG oslo_vmware.api [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Task: {'id': task-2204806, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064906} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1153.763673] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1153.763846] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1153.764021] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1153.764198] env[60044]: INFO nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1153.766239] env[60044]: DEBUG nova.compute.claims [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1153.766400] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1153.766635] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1153.785728] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1153.790840] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1153.791499] env[60044]: DEBUG nova.compute.utils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Instance c0f7ff03-5203-418d-aa9e-420448e9dbfb could not be found. {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1153.793203] env[60044]: DEBUG nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Instance disappeared during build. {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1153.793310] env[60044]: DEBUG nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1153.793462] env[60044]: DEBUG nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1153.793626] env[60044]: DEBUG nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1153.793780] env[60044]: DEBUG nova.network.neutron [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1153.875135] env[60044]: DEBUG oslo_vmware.rw_handles [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c0828bbf-dd45-4a1c-8fac-072c8d736d58/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1153.929915] env[60044]: DEBUG neutronclient.v2_0.client [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60044) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1153.931524] env[60044]: ERROR nova.compute.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1153.931524] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Traceback (most recent call last): [ 1153.931524] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1153.931524] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1153.931524] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1153.931524] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] result = getattr(controller, method)(*args, **kwargs) [ 1153.931524] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1153.931524] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self._get(image_id) [ 1153.931524] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1153.931524] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1153.931524] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1153.931524] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] resp, body = self.http_client.get(url, headers=header) [ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self.request(url, 'GET', **kwargs) [ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self._handle_response(resp) [ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise exc.from_response(resp, resp.content) [ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] During handling of the above exception, another exception occurred: [ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Traceback (most recent call last): [ 1153.931965] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self.driver.spawn(context, instance, image_meta, [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self._fetch_image_if_missing(context, vi) [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] image_fetch(context, vi, tmp_image_ds_loc) [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] images.fetch_image( [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] metadata = IMAGE_API.get(context, image_ref) [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1153.932371] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return session.show(context, image_id, [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] _reraise_translated_image_exception(image_id) [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise new_exc.with_traceback(exc_trace) [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: 
c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] result = getattr(controller, method)(*args, **kwargs) [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self._get(image_id) [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1153.932740] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] resp, body = self.http_client.get(url, headers=header) [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self.request(url, 'GET', **kwargs) [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self._handle_response(resp) [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise exc.from_response(resp, resp.content) [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] nova.exception.ImageNotAuthorized: Not authorized for image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. 
[ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] During handling of the above exception, another exception occurred: [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Traceback (most recent call last): [ 1153.933117] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self._build_and_run_instance(context, instance, image, [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] with excutils.save_and_reraise_exception(): [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self.force_reraise() [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise self.value [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] with self.rt.instance_claim(context, instance, node, allocs, [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self.abort() [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1153.933462] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1153.933931] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1153.933931] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return f(*args, **kwargs) [ 1153.933931] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1153.933931] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self._unset_instance_host_and_node(instance) [ 1153.933931] env[60044]: ERROR nova.compute.manager 
[instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1153.933931] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] instance.save() [ 1153.933931] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1153.933931] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] updates, result = self.indirection_api.object_action( [ 1153.933931] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1153.933931] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return cctxt.call(context, 'object_action', objinst=objinst, [ 1153.933931] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1153.933931] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] result = self.transport._send( [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self._driver.send(target, ctxt, message, [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise result [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] nova.exception_Remote.InstanceNotFound_Remote: Instance c0f7ff03-5203-418d-aa9e-420448e9dbfb could not be found. 
[ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Traceback (most recent call last): [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return getattr(target, method)(*args, **kwargs) [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.934336] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return fn(self, *args, **kwargs) [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] old_ref, inst_ref = db.instance_update_and_get_original( [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return f(*args, **kwargs) [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] with excutils.save_and_reraise_exception() as ectxt: [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self.force_reraise() [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.934730] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise self.value [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return f(*args, 
**kwargs) [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return f(context, *args, **kwargs) [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise exception.InstanceNotFound(instance_id=uuid) [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.935174] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] nova.exception.InstanceNotFound: Instance c0f7ff03-5203-418d-aa9e-420448e9dbfb could not be found. [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] During handling of the above exception, another exception occurred: [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Traceback (most recent call last): [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] ret = obj(*args, **kwargs) [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] exception_handler_v20(status_code, error_body) [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise client_exc(message=error_message, [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1153.935628] 
env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Neutron server returns request_ids: ['req-7ab96f3e-6b08-4bc4-869e-a5be17afae62'] [ 1153.935628] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] During handling of the above exception, another exception occurred: [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Traceback (most recent call last): [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self._deallocate_network(context, instance, requested_networks) [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self.network_api.deallocate_for_instance( [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] data = neutron.list_ports(**search_opts) [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] ret = obj(*args, **kwargs) [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1153.936058] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self.list('ports', self.ports_path, retrieve_all, [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] ret = obj(*args, **kwargs) [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] for r in self._pagination(collection, path, **params): [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] res = self.get(path, params=params) [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] ret = obj(*args, **kwargs) [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self.retry_request("GET", action, body=body, [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] ret = obj(*args, **kwargs) [ 1153.936440] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1153.936795] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] return self.do_request(method, action, body=body, [ 1153.936795] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1153.936795] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] ret = obj(*args, **kwargs) [ 1153.936795] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1153.936795] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] self._handle_fault_response(status_code, replybody, resp) [ 1153.936795] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1153.936795] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] raise exception.Unauthorized() [ 1153.936795] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] nova.exception.Unauthorized: Not authorized. [ 1153.936795] env[60044]: ERROR nova.compute.manager [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] [ 1153.936795] env[60044]: DEBUG oslo_vmware.rw_handles [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1153.937094] env[60044]: DEBUG oslo_vmware.rw_handles [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c0828bbf-dd45-4a1c-8fac-072c8d736d58/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1153.953462] env[60044]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "c0f7ff03-5203-418d-aa9e-420448e9dbfb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 290.656s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1191.020252] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1195.018522] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1195.018972] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1196.015601] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1196.026978] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1197.019427] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1197.019658] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1197.029446] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1197.029756] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1197.029794] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1197.029932] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1197.030980] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86e90cfa-f203-486c-ba27-c53836f1fcd9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.040408] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7bccfa4-8913-4530-82cb-708ba0f7ed8d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.053596] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b53776f-c6f1-418e-b9fa-1cff76545841 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.059526] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38dc3313-38e8-4eee-b69f-f63b61634f6a {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.087711] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181275MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1197.087711] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1197.087711] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1197.124334] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Instance 4754c01f-d312-4b2a-af5a-a34c5bcb42eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60044) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1197.124529] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1197.124669] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1197.149876] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f5ff191-2d85-4bb7-ba99-61bdb7de4ad4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.156535] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de9051b4-96d6-4bce-bd0e-9fd0797f1c35 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.186040] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc71dc2e-3e23-451a-94b2-21945f1f9b61 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.192332] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74f0ff54-c231-402f-ae01-9c0bec2d7761 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.204457] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1197.212495] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1197.224607] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1197.224771] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1198.224598] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] 
Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1198.224963] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1199.588624] env[60044]: WARNING oslo_vmware.rw_handles [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1199.588624] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1199.588624] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1199.588624] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1199.588624] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1199.588624] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 1199.588624] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1199.588624] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1199.588624] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1199.588624] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1199.588624] env[60044]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1199.588624] env[60044]: ERROR oslo_vmware.rw_handles [ 1199.589425] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/c0828bbf-dd45-4a1c-8fac-072c8d736d58/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1199.591040] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1199.591310] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Copying Virtual Disk [datastore2] vmware_temp/c0828bbf-dd45-4a1c-8fac-072c8d736d58/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/c0828bbf-dd45-4a1c-8fac-072c8d736d58/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1199.591592] env[60044]: DEBUG oslo_vmware.service [-] 
Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-93f65185-7e99-4c8b-b28d-31ff444bdfc9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1199.599692] env[60044]: DEBUG oslo_vmware.api [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Waiting for the task: (returnval){ [ 1199.599692] env[60044]: value = "task-2204807" [ 1199.599692] env[60044]: _type = "Task" [ 1199.599692] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1199.606796] env[60044]: DEBUG oslo_vmware.api [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Task: {'id': task-2204807, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1200.019029] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1200.109774] env[60044]: DEBUG oslo_vmware.exceptions [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Fault InvalidArgument not matched. {{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1200.111034] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1200.111034] env[60044]: ERROR nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1200.111034] env[60044]: Faults: ['InvalidArgument'] [ 1200.111034] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Traceback (most recent call last): [ 1200.111034] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1200.111034] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] yield resources [ 1200.111034] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1200.111034] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] self.driver.spawn(context, instance, image_meta, [ 1200.111034] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in 
spawn [ 1200.111034] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] self._fetch_image_if_missing(context, vi) [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] image_cache(vi, tmp_image_ds_loc) [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] vm_util.copy_virtual_disk( [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] session._wait_for_task(vmdk_copy_task) [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] return self.wait_for_task(task_ref) [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] return evt.wait() [ 1200.111405] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1200.111743] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] result = hub.switch() [ 1200.111743] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1200.111743] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] return self.greenlet.switch() [ 1200.111743] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1200.111743] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] self.f(*self.args, **self.kw) [ 1200.111743] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1200.111743] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] raise exceptions.translate_fault(task_info.error) [ 1200.111743] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1200.111743] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Faults: ['InvalidArgument'] [ 1200.111743] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] [ 1200.111743] env[60044]: INFO nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Terminating instance [ 1200.113955] env[60044]: DEBUG nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Start destroying the instance on the hypervisor. {{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1200.114167] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1200.114913] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c2670f5-c173-49c9-b115-7dc4a9b4a14d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1200.121259] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1200.121469] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-52b51565-72b4-4897-9011-a218ad5e0044 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1200.194021] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1200.194244] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1200.194405] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Deleting the datastore file [datastore2] 4754c01f-d312-4b2a-af5a-a34c5bcb42eb {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1200.194643] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with 
opID=oslo.vmware-a91d7c36-f385-4ac3-b1b6-7e90f392074e {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1200.200393] env[60044]: DEBUG oslo_vmware.api [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Waiting for the task: (returnval){ [ 1200.200393] env[60044]: value = "task-2204809" [ 1200.200393] env[60044]: _type = "Task" [ 1200.200393] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1200.208245] env[60044]: DEBUG oslo_vmware.api [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Task: {'id': task-2204809, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1200.710554] env[60044]: DEBUG oslo_vmware.api [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Task: {'id': task-2204809, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066753} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1200.711034] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1200.711034] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1200.711189] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1200.711321] env[60044]: INFO nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Took 0.60 seconds to destroy the instance on the hypervisor. 
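The datastore file removal above follows the usual oslo.vmware task pattern: a vim method ending in `_Task` returns a task managed object immediately, and the session polls it until vCenter reports completion (the `wait_for_task` / `_poll_task` lines at api.py:397 and api.py:434 in the log). A minimal sketch of that call sequence, assuming an already established `oslo_vmware.api.VMwareAPISession` is passed in as `session`, and that `ds_path` and `dc_ref` are a datastore path string and a datacenter managed-object reference obtained elsewhere:

    def delete_datastore_file(session, ds_path, dc_ref):
        # The FileManager lives on the service content of the vSphere API.
        file_manager = session.vim.service_content.fileManager
        # This returns a Task reference immediately; the deletion itself
        # runs asynchronously on the vCenter side.
        task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                                  file_manager, name=ds_path,
                                  datacenter=dc_ref)
        # Poll until the task finishes; a vim fault is translated into an
        # exception and raised from here rather than from the invocation.
        session.wait_for_task(task)

The same wait loop is what surfaced the `InvalidArgument` / `fileType` fault for the earlier `CopyVirtualDisk_Task`, which is why the spawn failure is raised inside `wait_for_task` rather than at the point the copy was requested.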
[ 1200.713486] env[60044]: DEBUG nova.compute.claims [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1200.713649] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1200.713856] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1200.776214] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-492bb658-6d23-4aa8-b2d1-eb6aeb17364c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1200.783092] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a69c8080-ec20-49e9-9997-d0cb924b9410 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1200.812880] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ced369cc-6b12-412b-aab4-60d7acb7099f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1200.820014] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3d53195-d0bb-4d1d-80e4-2d997b24371d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1200.832672] env[60044]: DEBUG nova.compute.provider_tree [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1200.842391] env[60044]: DEBUG nova.scheduler.client.report [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1200.855223] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 
tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.141s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1200.855723] env[60044]: ERROR nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1200.855723] env[60044]: Faults: ['InvalidArgument'] [ 1200.855723] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Traceback (most recent call last): [ 1200.855723] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1200.855723] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] self.driver.spawn(context, instance, image_meta, [ 1200.855723] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1200.855723] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1200.855723] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1200.855723] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] self._fetch_image_if_missing(context, vi) [ 1200.855723] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1200.855723] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] image_cache(vi, tmp_image_ds_loc) [ 1200.855723] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] vm_util.copy_virtual_disk( [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] session._wait_for_task(vmdk_copy_task) [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] return self.wait_for_task(task_ref) [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] return evt.wait() [ 
1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] result = hub.switch() [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] return self.greenlet.switch() [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1200.856064] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] self.f(*self.args, **self.kw) [ 1200.856412] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1200.856412] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] raise exceptions.translate_fault(task_info.error) [ 1200.856412] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1200.856412] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Faults: ['InvalidArgument'] [ 1200.856412] env[60044]: ERROR nova.compute.manager [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] [ 1200.856412] env[60044]: DEBUG nova.compute.utils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] VimFaultException {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1200.858078] env[60044]: DEBUG nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Build of instance 4754c01f-d312-4b2a-af5a-a34c5bcb42eb was re-scheduled: A specified parameter was not correct: fileType [ 1200.858078] env[60044]: Faults: ['InvalidArgument'] {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1200.858464] env[60044]: DEBUG nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1200.858631] env[60044]: DEBUG nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1200.858796] env[60044]: DEBUG nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1200.858981] env[60044]: DEBUG nova.network.neutron [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1201.018843] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1201.019085] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1201.019216] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1201.027215] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1201.112274] env[60044]: DEBUG nova.network.neutron [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1201.125825] env[60044]: INFO nova.compute.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Took 0.27 seconds to deallocate network for instance. 
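The inventory that the resource tracker keeps reporting as unchanged for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca determines how much the scheduler may place on this node: per resource class, the usable capacity is (total - reserved) * allocation_ratio, while max_unit only caps a single allocation. A small worked sketch using the exact figures from the log (the helper function is illustrative):

    # Inventory as reported in the log for provider
    # f00c8c1a-f294-46ac-89cc-95e9e57a7dca.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 176},
    }

    def schedulable(inv):
        # Placement-style capacity: overcommit is applied after subtracting
        # the reserved amount; max_unit limits one allocation, not the total.
        return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
                for rc, v in inv.items()}

    print(schedulable(inventory))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}

With the single m1.nano instance (1 VCPU, 128 MB, 1 GB) accounted for, that leaves ample headroom, which is why the subsequent claim for instance 2d7dbbc6-07b5-4f4c-8098-d190fabc545b succeeds immediately.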
[ 1201.206517] env[60044]: INFO nova.scheduler.client.report [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Deleted allocations for instance 4754c01f-d312-4b2a-af5a-a34c5bcb42eb [ 1201.222971] env[60044]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "4754c01f-d312-4b2a-af5a-a34c5bcb42eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 126.660s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1201.223829] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "4754c01f-d312-4b2a-af5a-a34c5bcb42eb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 95.994s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1201.223829] env[60044]: INFO nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] During sync_power_state the instance has a pending task (spawning). Skip. [ 1201.223829] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "4754c01f-d312-4b2a-af5a-a34c5bcb42eb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1206.765109] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "2d7dbbc6-07b5-4f4c-8098-d190fabc545b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1206.765419] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "2d7dbbc6-07b5-4f4c-8098-d190fabc545b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1206.774260] env[60044]: DEBUG nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Starting instance... 
{{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1206.820194] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1206.820499] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1206.822638] env[60044]: INFO nova.compute.claims [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1206.890662] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-609e481f-2d54-4bb2-956e-d699c2051597 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1206.898319] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52177d7a-151b-4d25-be85-00681a474699 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1206.927559] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43f5e239-6454-4a17-8708-5f96249478fe {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1206.934435] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a001c5ca-e308-4181-a17b-b0e0a5d3f259 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1206.947530] env[60044]: DEBUG nova.compute.provider_tree [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1206.957425] env[60044]: DEBUG nova.scheduler.client.report [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1206.971301] env[60044]: DEBUG oslo_concurrency.lockutils [None 
req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1206.971757] env[60044]: DEBUG nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Start building networks asynchronously for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1207.004813] env[60044]: DEBUG nova.compute.utils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Using /dev/sd instead of None {{(pid=60044) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1207.006264] env[60044]: DEBUG nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Allocating IP information in the background. {{(pid=60044) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1207.006471] env[60044]: DEBUG nova.network.neutron [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] allocate_for_instance() {{(pid=60044) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1207.014412] env[60044]: DEBUG nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Start building block device mappings for instance. {{(pid=60044) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1207.060582] env[60044]: DEBUG nova.policy [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '549edeef8bbc465c941651c9e0523f27', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c73de74d24b547d686045b7848f07007', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60044) authorize /opt/stack/nova/nova/policy.py:203}} [ 1207.074498] env[60044]: DEBUG nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Start spawning the instance on the hypervisor. 
{{(pid=60044) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1207.095081] env[60044]: DEBUG nova.virt.hardware [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:00:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:00:05Z,direct_url=,disk_format='vmdk',id=856e89ba-b7a4-4a81-ad9d-2997fe327c0c,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='6e30b729ea7246768c33961f1716d5e2',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:00:06Z,virtual_size=,visibility=), allow threads: False {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1207.095532] env[60044]: DEBUG nova.virt.hardware [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Flavor limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1207.095532] env[60044]: DEBUG nova.virt.hardware [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Image limits 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1207.095704] env[60044]: DEBUG nova.virt.hardware [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Flavor pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1207.095757] env[60044]: DEBUG nova.virt.hardware [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Image pref 0:0:0 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1207.095896] env[60044]: DEBUG nova.virt.hardware [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60044) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1207.096115] env[60044]: DEBUG nova.virt.hardware [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1207.096273] env[60044]: DEBUG nova.virt.hardware [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60044) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1207.096431] env[60044]: DEBUG nova.virt.hardware [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Got 1 possible topologies {{(pid=60044) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1207.096586] env[60044]: DEBUG nova.virt.hardware [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1207.096747] env[60044]: DEBUG nova.virt.hardware [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60044) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1207.097584] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01f15e27-788f-4661-8a4f-bae1d65c8861 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1207.105360] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72ae9b82-cf04-4efe-a095-77c53bceb824 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1207.331997] env[60044]: DEBUG nova.network.neutron [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Successfully created port: b5e6352f-33f2-4419-844c-40443d96dc56 {{(pid=60044) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1207.813062] env[60044]: DEBUG nova.compute.manager [req-36a5bc5b-3eba-41e4-ad96-ae20d93da108 req-6bbf19df-7a17-4ecf-a1e6-14400e0b0dda service nova] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Received event network-vif-plugged-b5e6352f-33f2-4419-844c-40443d96dc56 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1207.813341] env[60044]: DEBUG oslo_concurrency.lockutils [req-36a5bc5b-3eba-41e4-ad96-ae20d93da108 req-6bbf19df-7a17-4ecf-a1e6-14400e0b0dda service nova] Acquiring lock "2d7dbbc6-07b5-4f4c-8098-d190fabc545b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1207.813512] env[60044]: DEBUG oslo_concurrency.lockutils [req-36a5bc5b-3eba-41e4-ad96-ae20d93da108 req-6bbf19df-7a17-4ecf-a1e6-14400e0b0dda service nova] Lock "2d7dbbc6-07b5-4f4c-8098-d190fabc545b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1207.813674] env[60044]: DEBUG oslo_concurrency.lockutils [req-36a5bc5b-3eba-41e4-ad96-ae20d93da108 req-6bbf19df-7a17-4ecf-a1e6-14400e0b0dda service nova] Lock "2d7dbbc6-07b5-4f4c-8098-d190fabc545b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60044) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1207.813833] env[60044]: DEBUG nova.compute.manager [req-36a5bc5b-3eba-41e4-ad96-ae20d93da108 req-6bbf19df-7a17-4ecf-a1e6-14400e0b0dda service nova] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] No waiting events found dispatching network-vif-plugged-b5e6352f-33f2-4419-844c-40443d96dc56 {{(pid=60044) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1207.813990] env[60044]: WARNING nova.compute.manager [req-36a5bc5b-3eba-41e4-ad96-ae20d93da108 req-6bbf19df-7a17-4ecf-a1e6-14400e0b0dda service nova] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Received unexpected event network-vif-plugged-b5e6352f-33f2-4419-844c-40443d96dc56 for instance with vm_state building and task_state spawning. [ 1207.889501] env[60044]: DEBUG nova.network.neutron [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Successfully updated port: b5e6352f-33f2-4419-844c-40443d96dc56 {{(pid=60044) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1207.901045] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "refresh_cache-2d7dbbc6-07b5-4f4c-8098-d190fabc545b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1207.901207] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquired lock "refresh_cache-2d7dbbc6-07b5-4f4c-8098-d190fabc545b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1207.901358] env[60044]: DEBUG nova.network.neutron [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Building network info cache for instance {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1207.947212] env[60044]: DEBUG nova.network.neutron [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Instance cache missing network info. 
{{(pid=60044) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1208.092146] env[60044]: DEBUG nova.network.neutron [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Updating instance_info_cache with network_info: [{"id": "b5e6352f-33f2-4419-844c-40443d96dc56", "address": "fa:16:3e:78:8c:16", "network": {"id": "0937c21d-9390-49c7-aa42-c4de9c22364c", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-344735203-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c73de74d24b547d686045b7848f07007", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dcf5c3f7-4e33-4f21-b323-3673930b789c", "external-id": "nsx-vlan-transportzone-983", "segmentation_id": 983, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5e6352f-33", "ovs_interfaceid": "b5e6352f-33f2-4419-844c-40443d96dc56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1208.102782] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Releasing lock "refresh_cache-2d7dbbc6-07b5-4f4c-8098-d190fabc545b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1208.103064] env[60044]: DEBUG nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Instance network_info: |[{"id": "b5e6352f-33f2-4419-844c-40443d96dc56", "address": "fa:16:3e:78:8c:16", "network": {"id": "0937c21d-9390-49c7-aa42-c4de9c22364c", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-344735203-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c73de74d24b547d686045b7848f07007", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dcf5c3f7-4e33-4f21-b323-3673930b789c", "external-id": "nsx-vlan-transportzone-983", "segmentation_id": 983, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5e6352f-33", "ovs_interfaceid": "b5e6352f-33f2-4419-844c-40443d96dc56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60044) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1208.103445] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:8c:16', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dcf5c3f7-4e33-4f21-b323-3673930b789c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b5e6352f-33f2-4419-844c-40443d96dc56', 'vif_model': 'vmxnet3'}] {{(pid=60044) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1208.111109] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Creating folder: Project (c73de74d24b547d686045b7848f07007). Parent ref: group-v449562. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1208.111594] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a8691b87-659c-4abd-a447-1a143d844ddc {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1208.121992] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Created folder: Project (c73de74d24b547d686045b7848f07007) in parent group-v449562. [ 1208.122183] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Creating folder: Instances. Parent ref: group-v449626. {{(pid=60044) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1208.122404] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7e46f61c-dfd7-45ca-a0de-cbd2cfcc0f2c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1208.131537] env[60044]: INFO nova.virt.vmwareapi.vm_util [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Created folder: Instances in parent group-v449626. [ 1208.131759] env[60044]: DEBUG oslo.service.loopingcall [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60044) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1208.131927] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Creating VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1208.132147] env[60044]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0aee121e-12b5-4ee9-a9a2-c337a3e9752d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1208.150487] env[60044]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1208.150487] env[60044]: value = "task-2204812" [ 1208.150487] env[60044]: _type = "Task" [ 1208.150487] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1208.157678] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204812, 'name': CreateVM_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1208.660400] env[60044]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204812, 'name': CreateVM_Task, 'duration_secs': 0.298222} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1208.660589] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Created VM on the ESX host {{(pid=60044) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1208.668156] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1208.668344] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1208.668661] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1208.668912] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f550ec42-c591-40df-a366-633cd5b1cb2f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1208.673697] env[60044]: DEBUG oslo_vmware.api [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Waiting for the task: (returnval){ [ 1208.673697] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]527d35b2-2195-3aa2-c3d9-45f5f05b21b4" [ 1208.673697] env[60044]: _type = "Task" [ 1208.673697] env[60044]: } to complete. 
{{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1208.682981] env[60044]: DEBUG oslo_vmware.api [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]527d35b2-2195-3aa2-c3d9-45f5f05b21b4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1209.184309] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1209.184668] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Processing image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1209.184790] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1209.184892] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquired lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1209.185081] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1209.185324] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-eb0b0e67-394a-4163-ab7a-b52da64a0d0c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1209.193684] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1209.193877] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60044) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1209.194555] env[60044]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a39f11a0-7176-43b5-8d7b-8e1b781ccd75 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1209.199789] env[60044]: DEBUG oslo_vmware.api [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Waiting for the task: (returnval){ [ 1209.199789] env[60044]: value = "session[52968f94-3966-d3ff-7dd2-00393c470dc2]52af3bab-ffdc-64d0-eda5-c2674ae2db60" [ 1209.199789] env[60044]: _type = "Task" [ 1209.199789] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1209.207697] env[60044]: DEBUG oslo_vmware.api [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Task: {'id': session[52968f94-3966-d3ff-7dd2-00393c470dc2]52af3bab-ffdc-64d0-eda5-c2674ae2db60, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1209.710448] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Preparing fetch location {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1209.710791] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Creating directory with path [datastore2] vmware_temp/36c5413f-165b-4519-8f22-39e9b331dcdf/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1209.710926] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d80d5268-e167-4711-8ce1-ce67c0f15c36 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1209.731318] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Created directory with path [datastore2] vmware_temp/36c5413f-165b-4519-8f22-39e9b331dcdf/856e89ba-b7a4-4a81-ad9d-2997fe327c0c {{(pid=60044) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1209.731502] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Fetch image to [datastore2] vmware_temp/36c5413f-165b-4519-8f22-39e9b331dcdf/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1209.731663] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] 
Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to [datastore2] vmware_temp/36c5413f-165b-4519-8f22-39e9b331dcdf/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1209.732425] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f744d647-686e-406e-abac-791a85d00a9c {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1209.739319] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c12d41a6-c1cf-4095-9297-19a1c27e98b9 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1209.749758] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7c8a2ff-554e-4ad8-affb-c1ebb1f36915 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1209.779789] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6689479-e806-4573-bba0-f9a4e9b5027b {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1209.785664] env[60044]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e107d9c2-6f65-4306-b2ba-2ac6f42553e2 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1209.810854] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Downloading image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1209.841912] env[60044]: DEBUG nova.compute.manager [req-888fc62b-d344-4972-9851-9288af2ac272 req-0b7608ce-7534-4187-b845-a2eb5d733e94 service nova] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Received event network-changed-b5e6352f-33f2-4419-844c-40443d96dc56 {{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1209.842039] env[60044]: DEBUG nova.compute.manager [req-888fc62b-d344-4972-9851-9288af2ac272 req-0b7608ce-7534-4187-b845-a2eb5d733e94 service nova] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Refreshing instance network info cache due to event network-changed-b5e6352f-33f2-4419-844c-40443d96dc56. 
{{(pid=60044) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1209.842313] env[60044]: DEBUG oslo_concurrency.lockutils [req-888fc62b-d344-4972-9851-9288af2ac272 req-0b7608ce-7534-4187-b845-a2eb5d733e94 service nova] Acquiring lock "refresh_cache-2d7dbbc6-07b5-4f4c-8098-d190fabc545b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1209.842481] env[60044]: DEBUG oslo_concurrency.lockutils [req-888fc62b-d344-4972-9851-9288af2ac272 req-0b7608ce-7534-4187-b845-a2eb5d733e94 service nova] Acquired lock "refresh_cache-2d7dbbc6-07b5-4f4c-8098-d190fabc545b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1209.842679] env[60044]: DEBUG nova.network.neutron [req-888fc62b-d344-4972-9851-9288af2ac272 req-0b7608ce-7534-4187-b845-a2eb5d733e94 service nova] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Refreshing network info cache for port b5e6352f-33f2-4419-844c-40443d96dc56 {{(pid=60044) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1209.861099] env[60044]: DEBUG oslo_vmware.rw_handles [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/36c5413f-165b-4519-8f22-39e9b331dcdf/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1209.922314] env[60044]: DEBUG oslo_vmware.rw_handles [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Completed reading data from the image iterator. {{(pid=60044) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1209.922532] env[60044]: DEBUG oslo_vmware.rw_handles [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/36c5413f-165b-4519-8f22-39e9b331dcdf/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60044) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1210.136818] env[60044]: DEBUG nova.network.neutron [req-888fc62b-d344-4972-9851-9288af2ac272 req-0b7608ce-7534-4187-b845-a2eb5d733e94 service nova] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Updated VIF entry in instance network info cache for port b5e6352f-33f2-4419-844c-40443d96dc56. 
{{(pid=60044) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1210.137165] env[60044]: DEBUG nova.network.neutron [req-888fc62b-d344-4972-9851-9288af2ac272 req-0b7608ce-7534-4187-b845-a2eb5d733e94 service nova] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Updating instance_info_cache with network_info: [{"id": "b5e6352f-33f2-4419-844c-40443d96dc56", "address": "fa:16:3e:78:8c:16", "network": {"id": "0937c21d-9390-49c7-aa42-c4de9c22364c", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-344735203-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "c73de74d24b547d686045b7848f07007", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dcf5c3f7-4e33-4f21-b323-3673930b789c", "external-id": "nsx-vlan-transportzone-983", "segmentation_id": 983, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5e6352f-33", "ovs_interfaceid": "b5e6352f-33f2-4419-844c-40443d96dc56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1210.149955] env[60044]: DEBUG oslo_concurrency.lockutils [req-888fc62b-d344-4972-9851-9288af2ac272 req-0b7608ce-7534-4187-b845-a2eb5d733e94 service nova] Releasing lock "refresh_cache-2d7dbbc6-07b5-4f4c-8098-d190fabc545b" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1253.021174] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1256.277131] env[60044]: WARNING oslo_vmware.rw_handles [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1256.277131] env[60044]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1256.277131] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1256.277131] env[60044]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1256.277131] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1256.277131] env[60044]: ERROR oslo_vmware.rw_handles response.begin() [ 1256.277131] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1256.277131] env[60044]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1256.277131] env[60044]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1256.277131] env[60044]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1256.277131] env[60044]: 
ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1256.277131] env[60044]: ERROR oslo_vmware.rw_handles [ 1256.277861] env[60044]: DEBUG nova.virt.vmwareapi.images [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Downloaded image file data 856e89ba-b7a4-4a81-ad9d-2997fe327c0c to vmware_temp/36c5413f-165b-4519-8f22-39e9b331dcdf/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk on the data store datastore2 {{(pid=60044) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1256.279542] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Caching image {{(pid=60044) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1256.279817] env[60044]: DEBUG nova.virt.vmwareapi.vm_util [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Copying Virtual Disk [datastore2] vmware_temp/36c5413f-165b-4519-8f22-39e9b331dcdf/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/tmp-sparse.vmdk to [datastore2] vmware_temp/36c5413f-165b-4519-8f22-39e9b331dcdf/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk {{(pid=60044) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1256.280139] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3e83400a-f122-4108-9177-d7556d7500a2 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1256.288892] env[60044]: DEBUG oslo_vmware.api [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Waiting for the task: (returnval){ [ 1256.288892] env[60044]: value = "task-2204813" [ 1256.288892] env[60044]: _type = "Task" [ 1256.288892] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1256.297745] env[60044]: DEBUG oslo_vmware.api [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Task: {'id': task-2204813, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1256.799401] env[60044]: DEBUG oslo_vmware.exceptions [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Fault InvalidArgument not matched. 
{{(pid=60044) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1256.799639] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Releasing lock "[datastore2] devstack-image-cache_base/856e89ba-b7a4-4a81-ad9d-2997fe327c0c/856e89ba-b7a4-4a81-ad9d-2997fe327c0c.vmdk" {{(pid=60044) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1256.800239] env[60044]: ERROR nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1256.800239] env[60044]: Faults: ['InvalidArgument'] [ 1256.800239] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Traceback (most recent call last): [ 1256.800239] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1256.800239] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] yield resources [ 1256.800239] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1256.800239] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] self.driver.spawn(context, instance, image_meta, [ 1256.800239] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1256.800239] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1256.800239] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1256.800239] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] self._fetch_image_if_missing(context, vi) [ 1256.800239] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] image_cache(vi, tmp_image_ds_loc) [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] vm_util.copy_virtual_disk( [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] session._wait_for_task(vmdk_copy_task) [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] return self.wait_for_task(task_ref) [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] return evt.wait() [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] result = hub.switch() [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1256.800733] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] return self.greenlet.switch() [ 1256.801104] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1256.801104] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] self.f(*self.args, **self.kw) [ 1256.801104] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1256.801104] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] raise exceptions.translate_fault(task_info.error) [ 1256.801104] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1256.801104] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Faults: ['InvalidArgument'] [ 1256.801104] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] [ 1256.801104] env[60044]: INFO nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Terminating instance [ 1256.803279] env[60044]: DEBUG nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Start destroying the instance on the hypervisor. 
{{(pid=60044) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1256.803468] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Destroying instance {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1256.804183] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a652cf7e-8772-4dfd-bd97-b9995d320396 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1256.811158] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Unregistering the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1256.811361] env[60044]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-520a3d31-76d8-4776-965d-8e06ab9ad832 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1256.881659] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Unregistered the VM {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1256.881927] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Deleting contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1256.882048] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Deleting the datastore file [datastore2] 2d7dbbc6-07b5-4f4c-8098-d190fabc545b {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1256.882319] env[60044]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a853cb60-16a4-47ed-b8ff-998f0d2b45be {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1256.888028] env[60044]: DEBUG oslo_vmware.api [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Waiting for the task: (returnval){ [ 1256.888028] env[60044]: value = "task-2204815" [ 1256.888028] env[60044]: _type = "Task" [ 1256.888028] env[60044]: } to complete. {{(pid=60044) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1256.895837] env[60044]: DEBUG oslo_vmware.api [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Task: {'id': task-2204815, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1257.018589] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1257.018782] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60044) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1257.397879] env[60044]: DEBUG oslo_vmware.api [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Task: {'id': task-2204815, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074525} completed successfully. {{(pid=60044) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1257.398248] env[60044]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Deleted the datastore file {{(pid=60044) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1257.398248] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Deleted contents of the VM from datastore datastore2 {{(pid=60044) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1257.398419] env[60044]: DEBUG nova.virt.vmwareapi.vmops [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Instance destroyed {{(pid=60044) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1257.398590] env[60044]: INFO nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1257.400923] env[60044]: DEBUG nova.compute.claims [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Aborting claim: {{(pid=60044) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1257.401107] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1257.401317] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1257.462025] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26fcd496-6b57-4f87-9023-a3949cea680f {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1257.469294] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-accb512a-f90c-4e8a-9dc8-a4b174bfa9b7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1257.498458] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97cb0f83-53c0-4bb8-aa6e-e841a072e187 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1257.505433] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd33e701-2864-402e-b191-8da09fdd286d {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1257.518592] env[60044]: DEBUG nova.compute.provider_tree [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1257.526727] env[60044]: DEBUG nova.scheduler.client.report [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1257.540733] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c 
tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.139s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1257.541280] env[60044]: ERROR nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1257.541280] env[60044]: Faults: ['InvalidArgument'] [ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Traceback (most recent call last): [ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] self.driver.spawn(context, instance, image_meta, [ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] self._fetch_image_if_missing(context, vi) [ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] image_cache(vi, tmp_image_ds_loc) [ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] vm_util.copy_virtual_disk( [ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] session._wait_for_task(vmdk_copy_task) [ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] return self.wait_for_task(task_ref) [ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] return evt.wait() [ 1257.541589] 
[ 1257.541280] env[60044]: ERROR nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1257.541280] env[60044]: Faults: ['InvalidArgument']
[ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Traceback (most recent call last):
[ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] self.driver.spawn(context, instance, image_meta,
[ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] self._fetch_image_if_missing(context, vi)
[ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] image_cache(vi, tmp_image_ds_loc)
[ 1257.541280] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] vm_util.copy_virtual_disk(
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] session._wait_for_task(vmdk_copy_task)
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] return self.wait_for_task(task_ref)
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] return evt.wait()
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] result = hub.switch()
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] return self.greenlet.switch()
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1257.541589] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] self.f(*self.args, **self.kw)
[ 1257.541893] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1257.541893] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] raise exceptions.translate_fault(task_info.error)
[ 1257.541893] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1257.541893] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Faults: ['InvalidArgument']
[ 1257.541893] env[60044]: ERROR nova.compute.manager [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b]
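The traceback above ends inside oslo.vmware's task poller: the CopyVirtualDisk_Task started while caching the sparse image fails on the vCenter side, and _poll_task translates the fault into VimFaultException ("A specified parameter was not correct: fileType", fault name InvalidArgument), which then propagates up through wait_for_task() into the spawn path. A hedged sketch of the calling pattern at the bottom of that stack; session and copy_task are supplied by the caller (placeholders here), and the helper name is mine:

from oslo_vmware import exceptions as vexc

def wait_for_vmdk_copy(session, copy_task):
    # session: an oslo_vmware.api.VMwareAPISession; copy_task: the task moref
    # returned by the CopyVirtualDisk_Task invocation (both elided here).
    try:
        # wait_for_task() polls the task and re-raises the vCenter fault,
        # which is exactly what the traceback above shows.
        return session.wait_for_task(copy_task)
    except vexc.VimFaultException as exc:
        # exc.fault_list carries the fault names, e.g. ['InvalidArgument'].
        print('vCenter task failed: %s (faults: %s)' % (exc, exc.fault_list))
        raise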
[ 1257.542029] env[60044]: DEBUG nova.compute.utils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] VimFaultException {{(pid=60044) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1257.543340] env[60044]: DEBUG nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Build of instance 2d7dbbc6-07b5-4f4c-8098-d190fabc545b was re-scheduled: A specified parameter was not correct: fileType
[ 1257.543340] env[60044]: Faults: ['InvalidArgument'] {{(pid=60044) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 1257.543708] env[60044]: DEBUG nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Unplugging VIFs for instance {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1257.543875] env[60044]: DEBUG nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60044) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1257.544052] env[60044]: DEBUG nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Deallocating network for instance {{(pid=60044) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1257.544215] env[60044]: DEBUG nova.network.neutron [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] deallocate_for_instance() {{(pid=60044) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1257.798487] env[60044]: DEBUG nova.network.neutron [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Updating instance_info_cache with network_info: [] {{(pid=60044) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1257.810506] env[60044]: INFO nova.compute.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Took 0.27 seconds to deallocate network for instance.
[ 1257.888153] env[60044]: INFO nova.scheduler.client.report [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Deleted allocations for instance 2d7dbbc6-07b5-4f4c-8098-d190fabc545b
[ 1257.905507] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "2d7dbbc6-07b5-4f4c-8098-d190fabc545b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 51.140s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1258.018999] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1258.019248] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1258.019447] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1258.019647] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager.update_available_resource {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
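"Deleted allocations for instance 2d7dbbc6-..." a few lines up is the report client removing the failed instance's Placement allocations so that the capacity is free again for the re-schedule. The same cleanup can be performed directly against the Placement REST API; a minimal sketch with requests, where the endpoint and token are placeholders a real caller would take from the Keystone catalog:

import requests

PLACEMENT = 'http://placement.example.org/placement'  # placeholder endpoint
TOKEN = 'gAAAA...'                                     # placeholder token
INSTANCE_UUID = '2d7dbbc6-07b5-4f4c-8098-d190fabc545b'

# DELETE /allocations/{consumer_uuid} drops every allocation the consumer
# (here, the instance) holds against any resource provider; Placement
# answers 204 on success.
resp = requests.delete(
    PLACEMENT + '/allocations/' + INSTANCE_UUID,
    headers={'X-Auth-Token': TOKEN,
             'OpenStack-API-Version': 'placement 1.28'})
print(resp.status_code)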
[ 1258.028873] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1258.029081] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1258.029245] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1258.029395] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60044) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1258.030453] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4fe6dec-bf04-4849-852c-04aea9557be2 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1258.039104] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89c2cc3f-c0c6-4b9f-acfe-156efa0135c1 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1258.053137] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02dc8e6b-d4b0-4f90-8bb7-00dd718573e7 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1258.059357] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e89d34ff-7b87-4a93-9556-0f8284a06e66 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1258.087674] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181274MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60044) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1258.087821] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1258.088015] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1258.120227] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1258.120400] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60044) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1258.135194] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ad10986-97c3-4b10-987a-f88c352e05c4 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1258.142591] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9e30a6a-777f-402d-968c-30f05c54a7ed {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1258.173525] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c001df27-09c4-4609-ba2c-1f739d50d5a2 {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1258.180766] env[60044]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-861b48dc-f8a4-4551-b43a-1c43e92248ca {{(pid=60044) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1258.193861] env[60044]: DEBUG nova.compute.provider_tree [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed in ProviderTree for provider: f00c8c1a-f294-46ac-89cc-95e9e57a7dca {{(pid=60044) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1258.201945] env[60044]: DEBUG nova.scheduler.client.report [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Inventory has not changed for provider f00c8c1a-f294-46ac-89cc-95e9e57a7dca based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60044) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1258.214056] env[60044]: DEBUG nova.compute.resource_tracker [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60044) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1258.214237] env[60044]: DEBUG oslo_concurrency.lockutils [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s {{(pid=60044) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1259.209771] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
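The PropertyCollector.RetrievePropertiesEx calls bracketing the resource-view lines above are the driver reading cluster capacity (the free_ram/free_disk/free_vcpus figures) from vCenter before the tracker updates Placement. A hedged sketch of the underlying oslo.vmware call pattern, not the driver's exact code; the session is assumed to exist already and the moref value is taken from the node name in the log (domain-c8):

from oslo_vmware import vim_util

def get_cluster_summary(session, cluster_value='domain-c8'):
    # Build a managed-object reference for the cluster backing this node.
    cluster_ref = vim_util.get_moref(cluster_value, 'ClusterComputeResource')
    # invoke_api() routes the call through the session's retry/fault
    # handling and issues the kind of PropertyCollector retrieval logged
    # above; 'summary' is one property a capacity audit might read.
    return session.invoke_api(vim_util, 'get_object_property',
                              session.vim, cluster_ref, 'summary')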
[ 1260.018552] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1261.020058] env[60044]: DEBUG oslo_service.periodic_task [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60044) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1261.020058] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Starting heal instance info cache {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1261.020058] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Rebuilding the list of instances to heal {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1261.027930] env[60044]: DEBUG nova.compute.manager [None req-b47e0474-a32c-4692-8f1e-5ba94170d21e None None] Didn't find any instances for network info cache update. {{(pid=60044) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
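The "Running periodic task ..." lines are oslo.service's periodic task machinery walking the ComputeManager's registered tasks on its timer. A minimal standalone sketch of that mechanism; the class and task names below are illustrative, not Nova's:

from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF

class Manager(periodic_task.PeriodicTasks):
    def __init__(self):
        super().__init__(CONF)

    # spacing is in seconds; run_immediately makes the first pass run it.
    @periodic_task.periodic_task(spacing=10, run_immediately=True)
    def _heal_info_cache(self, context):
        print('healing info cache')

mgr = Manager()
# A service loop calls this repeatedly; each due task is logged as
# "Running periodic task ..." just like the lines above.
mgr.run_periodic_tasks(context=None)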