[ 568.111776] env[67093]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 568.741339] env[67144]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 570.275677] env[67144]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=67144) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 570.276034] env[67144]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=67144) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 570.276158] env[67144]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=67144) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 570.276439] env[67144]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 570.277548] env[67144]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 570.396782] env[67144]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=67144) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 570.406740] env[67144]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=67144) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 570.506157] env[67144]: INFO nova.virt.driver [None req-9ded2621-ab50-45f8-b21a-b75608b70286 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 570.578513] env[67144]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 570.578673] env[67144]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 570.578770] env[67144]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=67144) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 573.846321] env[67144]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-392a889f-19a7-4454-953e-ae693e347d63 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.862283] env[67144]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=67144) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 573.862537] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-987970d4-aada-4f5f-b2c2-37e9e73245f6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.895059] env[67144]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 82a3d.
[ 573.895296] env[67144]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.317s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 573.895950] env[67144]: INFO nova.virt.vmwareapi.driver [None req-9ded2621-ab50-45f8-b21a-b75608b70286 None None] VMware vCenter version: 7.0.3
[ 573.899426] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83159fb6-9dcc-4d85-ac27-4fe300889277 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.917870] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd4909d6-c482-4c90-bf54-fa51b94a0b61 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.924056] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cd8eda1-7d85-4e37-8e63-6c48bb967089 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.930959] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c4d3e07-5fc0-49dd-ac85-4372cf375b69 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.945649] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1299319-ccb2-416f-9925-a3ffc84d18c0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.953799] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5138f9bb-caa8-4aa3-b563-3dca431b0353 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.984679] env[67144]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-1fbecb5a-2e19-4378-8712-b2c127a3ba67 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.990233] env[67144]: DEBUG nova.virt.vmwareapi.driver [None req-9ded2621-ab50-45f8-b21a-b75608b70286 None None] Extension org.openstack.compute already exists. {{(pid=67144) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 573.993062] env[67144]: INFO nova.compute.provider_config [None req-9ded2621-ab50-45f8-b21a-b75608b70286 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 574.011253] env[67144]: DEBUG nova.context [None req-9ded2621-ab50-45f8-b21a-b75608b70286 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),08051b53-9cd2-4ac5-b5cd-b397d4bc8595(cell1) {{(pid=67144) load_cells /opt/stack/nova/nova/context.py:464}}
[ 574.013723] env[67144]: DEBUG oslo_concurrency.lockutils [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 574.013983] env[67144]: DEBUG oslo_concurrency.lockutils [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 574.014787] env[67144]: DEBUG oslo_concurrency.lockutils [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 574.015343] env[67144]: DEBUG oslo_concurrency.lockutils [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Acquiring lock "08051b53-9cd2-4ac5-b5cd-b397d4bc8595" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 574.015604] env[67144]: DEBUG oslo_concurrency.lockutils [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Lock "08051b53-9cd2-4ac5-b5cd-b397d4bc8595" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 574.016655] env[67144]: DEBUG oslo_concurrency.lockutils [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Lock "08051b53-9cd2-4ac5-b5cd-b397d4bc8595" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 574.030307] env[67144]: DEBUG oslo_db.sqlalchemy.engines [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67144) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 574.030672] env[67144]: DEBUG oslo_db.sqlalchemy.engines [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67144) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 574.037421] env[67144]: ERROR nova.db.main.api [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 574.037421] env[67144]: result = function(*args, **kwargs)
[ 574.037421] env[67144]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 574.037421] env[67144]: return func(*args, **kwargs)
[ 574.037421] env[67144]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 574.037421] env[67144]: result = fn(*args, **kwargs)
[ 574.037421] env[67144]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 574.037421] env[67144]: return f(*args, **kwargs)
[ 574.037421] env[67144]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 574.037421] env[67144]: return db.service_get_minimum_version(context, binaries)
[ 574.037421] env[67144]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 574.037421] env[67144]: _check_db_access()
[ 574.037421] env[67144]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 574.037421] env[67144]: stacktrace = ''.join(traceback.format_stack())
[ 574.037421] env[67144]:
[ 574.038441] env[67144]: ERROR nova.db.main.api [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 574.038441] env[67144]: result = function(*args, **kwargs)
[ 574.038441] env[67144]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 574.038441] env[67144]: return func(*args, **kwargs)
[ 574.038441] env[67144]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 574.038441] env[67144]: result = fn(*args, **kwargs)
[ 574.038441] env[67144]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 574.038441] env[67144]: return f(*args, **kwargs)
[ 574.038441] env[67144]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 574.038441] env[67144]: return db.service_get_minimum_version(context, binaries)
[ 574.038441] env[67144]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 574.038441] env[67144]: _check_db_access()
[ 574.038441] env[67144]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 574.038441] env[67144]: stacktrace = ''.join(traceback.format_stack())
[ 574.038441] env[67144]:
[ 574.039012] env[67144]: WARNING nova.objects.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Failed to get minimum service version for cell 08051b53-9cd2-4ac5-b5cd-b397d4bc8595
[ 574.039012] env[67144]: WARNING nova.objects.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 574.039381] env[67144]: DEBUG oslo_concurrency.lockutils [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Acquiring lock "singleton_lock" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 574.039544] env[67144]: DEBUG oslo_concurrency.lockutils [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Acquired lock "singleton_lock" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 574.039785] env[67144]: DEBUG oslo_concurrency.lockutils [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Releasing lock "singleton_lock" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 574.040118] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Full set of CONF: {{(pid=67144) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 574.040261] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ******************************************************************************** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 574.040390] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] Configuration options gathered from: {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 574.040523] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 574.040709] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 574.040833] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ================================================================================ {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 574.041051] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] allow_resize_to_same_host = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.041225] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] arq_binding_timeout = 300 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.041354] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] backdoor_port = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.041479] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] backdoor_socket = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.041641] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] block_device_allocate_retries = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.041799] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] block_device_allocate_retries_interval = 3 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.041964] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cert = self.pem {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.042139] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.042306] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute_monitors = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.042467] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] config_dir = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.042633] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] config_drive_format = iso9660 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.042765] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.042924] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] config_source = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.043099] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] console_host = devstack {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.043266] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] control_exchange = nova {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.043421] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cpu_allocation_ratio = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.043581] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] daemon = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.043748] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] debug = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.043901] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] default_access_ip_network_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.044077] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] default_availability_zone = nova {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.044231] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] default_ephemeral_format = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.044464] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.044624] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] default_schedule_zone = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.044779] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] disk_allocation_ratio = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.044938] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] enable_new_services = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.045126] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] enabled_apis = ['osapi_compute'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.045288] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] enabled_ssl_apis = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.045446] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] flat_injected = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.045630] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] force_config_drive = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.045795] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] force_raw_images = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.045964] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] graceful_shutdown_timeout = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.046138] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] heal_instance_info_cache_interval = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.046350] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] host = cpu-1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.046526] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.046705] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] initial_disk_allocation_ratio = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.046871] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] initial_ram_allocation_ratio = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.047090] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.047256] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] instance_build_timeout = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.047413] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] instance_delete_interval = 300 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.047576] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] instance_format = [instance: %(uuid)s] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.047741] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] instance_name_template = instance-%08x {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.047940] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] instance_usage_audit = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.048139] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] instance_usage_audit_period = month {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.048309] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.048473] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] instances_path = /opt/stack/data/nova/instances {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.048641] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] internal_service_availability_zone = internal {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.048794] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] key = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.048950] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] live_migration_retry_count = 30 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.049121] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] log_config_append = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.049285] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.049440] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] log_dir = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.049593] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] log_file = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.049717] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] log_options = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.049873] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] log_rotate_interval = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.050048] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] log_rotate_interval_type = days {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.050216] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] log_rotation_type = none {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.050340] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.050462] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.050626] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.050787] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.050912] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.051081] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] long_rpc_timeout = 1800 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.051241] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] max_concurrent_builds = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.051396] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] max_concurrent_live_migrations = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.051548] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] max_concurrent_snapshots = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.051704] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] max_local_block_devices = 3 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.051858] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] max_logfile_count = 30 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.052025] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] max_logfile_size_mb = 200 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.052174] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] maximum_instance_delete_attempts = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.052338] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] metadata_listen = 0.0.0.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.052498] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] metadata_listen_port = 8775 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.052666] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] metadata_workers = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.052822] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] migrate_max_retries = -1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.052986] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] mkisofs_cmd = genisoimage {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.053205] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] my_block_storage_ip = 10.180.1.21 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.053338] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] my_ip = 10.180.1.21 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.053501] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] network_allocate_retries = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 574.053676] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] non_inheritable_image_properties
= ['cache_in_nova', 'bittorrent'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.053842] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] osapi_compute_listen = 0.0.0.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.054008] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] osapi_compute_listen_port = 8774 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.054183] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] osapi_compute_unique_server_name_scope = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.054349] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] osapi_compute_workers = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.054507] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] password_length = 12 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.054665] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] periodic_enable = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.054824] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] periodic_fuzzy_delay = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.054989] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] pointer_model = usbtablet {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.055164] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] preallocate_images = none {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.055323] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] publish_errors = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.055452] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] pybasedir = /opt/stack/nova {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.055633] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ram_allocation_ratio = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.055799] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] rate_limit_burst = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.055962] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] rate_limit_except_level = CRITICAL {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.056136] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] rate_limit_interval = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.056296] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] reboot_timeout = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.056463] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] reclaim_instance_interval = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.056620] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] record = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.056790] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] reimage_timeout_per_gb = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.056957] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] report_interval = 120 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.057130] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] rescue_timeout = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.057291] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] reserved_host_cpus = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.057447] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] reserved_host_disk_mb = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.057603] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] reserved_host_memory_mb = 512 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.057763] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] reserved_huge_pages = None {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.057988] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] resize_confirm_window = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.058123] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] resize_fs_using_block_device = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.058284] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] resume_guests_state_on_host_boot = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.058451] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.058612] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] rpc_response_timeout = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.058771] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] run_external_periodic_tasks = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.058936] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] running_deleted_instance_action = reap {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.059107] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] running_deleted_instance_poll_interval = 1800 {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.059266] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] running_deleted_instance_timeout = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.059424] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler_instance_sync_interval = 120 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.059557] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] service_down_time = 300 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.059725] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] servicegroup_driver = db {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.059883] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] shelved_offload_time = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061096] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] shelved_poll_interval = 3600 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061096] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] shutdown_timeout = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061096] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] source_is_ipv6 = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061096] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ssl_only = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061096] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061096] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] sync_power_state_interval = 600 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061096] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] sync_power_state_pool_size = 1000 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061319] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] syslog_log_facility = LOG_USER {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061420] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] tempdir = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061532] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] timeout_nbd = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061705] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] transport_url = **** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.061864] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] update_resources_interval = 0 {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.062033] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] use_cow_images = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.062194] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] use_eventlog = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.062351] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] use_journal = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.062508] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] use_json = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.062666] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] use_rootwrap_daemon = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.062821] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] use_stderr = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.062974] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] use_syslog = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.063138] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vcpu_pin_set = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.063304] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
vif_plugging_is_fatal = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.063480] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plugging_timeout = 300 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.063635] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] virt_mkfs = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.063796] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] volume_usage_poll_interval = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.063954] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] watch_log_file = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.064128] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] web = /usr/share/spice-html5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.064312] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_concurrency.disable_process_locking = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.064615] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.064988] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.064988] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.065134] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.065279] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.065440] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.065652] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.auth_strategy = keystone {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.065822] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.compute_link_prefix = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.065994] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.066178] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.dhcp_domain = novalocal {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.066349] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.enable_instance_password = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.066510] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.glance_link_prefix = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.066697] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.066874] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.067045] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.instance_list_per_project_cells = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.067209] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.list_records_by_skipping_down_cells = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.067372] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.local_metadata_per_cell = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.067541] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.max_limit = 1000 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.067707] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.metadata_cache_expiration = 15 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.067910] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.neutron_default_tenant_id = default {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.068109] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.use_forwarded_for = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.068282] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.use_neutron_default_nets = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.068450] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.068613] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.068781] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.068952] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
api.vendordata_dynamic_ssl_certfile = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.069132] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.vendordata_dynamic_targets = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.069298] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.vendordata_jsonfile_path = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.069476] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.069668] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.backend = dogpile.cache.memcached {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.069833] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.backend_argument = **** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.070007] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.config_prefix = cache.oslo {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.070184] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.dead_timeout = 60.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.070348] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.debug_cache_backend = False {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.070509] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.enable_retry_client = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.070671] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.enable_socket_keepalive = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.070840] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.enabled = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.071013] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.expiration_time = 600 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.071185] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.hashclient_retry_attempts = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.071352] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.hashclient_retry_delay = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.071517] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.memcache_dead_retry = 300 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.071688] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.memcache_password = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.071852] env[67144]: DEBUG 
oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.072018] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.072186] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.memcache_pool_maxsize = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.072349] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.072511] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.memcache_sasl_enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.072702] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.072873] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.memcache_socket_timeout = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.073053] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.memcache_username = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.073223] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.proxies = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.073391] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.retry_attempts = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.073552] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.retry_delay = 0.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.073716] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.socket_keepalive_count = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.073876] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.socket_keepalive_idle = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.074043] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.socket_keepalive_interval = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.074204] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.tls_allowed_ciphers = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.074359] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.tls_cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.074512] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.tls_certfile = None {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.074672] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.tls_enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.074834] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cache.tls_keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.075009] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.auth_section = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.075191] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.auth_type = password {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.075351] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.075548] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.catalog_info = volumev3::publicURL {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.075724] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.075889] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.076066] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.cross_az_attach = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.076234] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.debug = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.076392] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.endpoint_template = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.076558] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.http_retries = 3 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.076744] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.076905] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.077088] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.os_region_name = RegionOne {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.077255] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.077415] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cinder.timeout = None {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.077587] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.077748] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.cpu_dedicated_set = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.077940] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.cpu_shared_set = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.078132] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.image_type_exclude_list = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.078300] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.078463] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.max_concurrent_disk_ops = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.078628] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.max_disk_devices_to_attach = -1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.078792] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.078961] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.079136] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.resource_provider_association_refresh = 300 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.079302] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.shutdown_retry_interval = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.079483] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.079663] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] conductor.workers = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.079836] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] console.allowed_origins = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.079998] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] console.ssl_ciphers = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.080182] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] console.ssl_minimum_version = default {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.080355] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] consoleauth.token_ttl = 600 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.080523] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.080680] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.080840] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.081007] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.connect_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.081173] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.connect_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.081330] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.endpoint_override = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.081491] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.081648] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.081806] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.max_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.081962] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.min_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.082132] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.region_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.082291] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.service_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.082459] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.service_type = accelerator {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.082622] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.082782] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.status_code_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.082938] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.status_code_retry_delay = None {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.083104] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.083282] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.083458] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] cyborg.version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.083635] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.backend = sqlalchemy {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.083813] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.connection = **** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.083980] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.connection_debug = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.084164] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.connection_parameters = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.084328] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.connection_recycle_time = 3600 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.084494] env[67144]: DEBUG 
oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.connection_trace = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.084659] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.db_inc_retry_interval = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.084852] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.db_max_retries = 20 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.085039] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.db_max_retry_interval = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.085196] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.db_retry_interval = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.085366] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.max_overflow = 50 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.085551] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.max_pool_size = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.085731] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.max_retries = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.085896] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
database.mysql_enable_ndb = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.086076] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.086240] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.mysql_wsrep_sync_wait = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.086400] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.pool_timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.086580] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.retry_interval = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.086731] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.slave_connection = **** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.086894] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.sqlite_synchronous = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.087067] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] database.use_db_reconnect = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.087252] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.backend = sqlalchemy {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.088920] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.connection = **** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.089138] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.connection_debug = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.089327] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.connection_parameters = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.089503] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.connection_recycle_time = 3600 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.089680] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.connection_trace = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.089848] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.db_inc_retry_interval = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.090025] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.db_max_retries = 20 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.090200] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.db_max_retry_interval = 10 {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.090366] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.db_retry_interval = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.090538] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.max_overflow = 50 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.090705] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.max_pool_size = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.090874] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.max_retries = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.091048] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.mysql_enable_ndb = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.091224] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.091387] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.091553] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.pool_timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
574.091723] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.retry_interval = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.091884] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.slave_connection = **** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.092061] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] api_database.sqlite_synchronous = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.092245] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] devices.enabled_mdev_types = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.092425] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.092590] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ephemeral_storage_encryption.enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.092758] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.092925] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.api_servers = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.093102] env[67144]: DEBUG 
oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.093267] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.093439] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.093596] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.connect_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.093759] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.connect_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.093923] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.debug = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.094100] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.default_trusted_certificate_ids = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.094269] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.enable_certificate_validation = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.094433] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
glance.enable_rbd_download = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.094592] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.endpoint_override = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.094761] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.094923] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.095093] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.max_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.095255] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.min_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.095417] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.num_retries = 3 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.095608] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.rbd_ceph_conf = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.095781] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.rbd_connect_timeout = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.095952] 
env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.rbd_pool = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.096136] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.rbd_user = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.096298] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.region_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.096455] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.service_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.096624] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.service_type = image {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.096788] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.096945] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.status_code_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.097116] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.status_code_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.097280] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.timeout = None {{(pid=67144) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.097461] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.097627] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.verify_glance_signatures = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.097794] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] glance.version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.097986] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] guestfs.debug = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.098179] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.config_drive_cdrom = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.098345] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.config_drive_inject_password = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.098512] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.098676] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.enable_instance_metrics_collection = False {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.098838] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.enable_remotefx = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.099014] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.instances_path_share = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.099184] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.iscsi_initiator_list = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.099346] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.limit_cpu_features = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.099511] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.099678] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.099847] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.power_state_check_timeframe = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.100025] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.100195] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.100360] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.use_multipath_io = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.100526] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.volume_attach_retry_count = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.100690] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.100849] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.vswitch_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.101021] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.101197] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] mks.enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.101554] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.101748] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] image_cache.manager_interval = 2400 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.101918] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] image_cache.precache_concurrency = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.102103] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] image_cache.remove_unused_base_images = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.102276] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.102447] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.102623] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] image_cache.subdirectory_name = _base {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.102800] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.api_max_retries = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.102962] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.api_retry_interval = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.103132] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.auth_section = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.103292] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.auth_type = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.103447] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.103600] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.103764] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.103920] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.connect_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.104091] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.connect_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.104250] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.endpoint_override = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.104409] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.104562] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.104717] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.max_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.104869] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.min_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.105033] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.partition_key = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.105199] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.peer_list = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.105354] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.region_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.105546] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.serial_console_state_timeout = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.105714] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.service_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.105887] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.service_type = baremetal {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.106108] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.106338] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.status_code_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.106537] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.status_code_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.106719] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.106906] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.107079] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] ironic.version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.107270] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.107444] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] key_manager.fixed_key = **** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.107628] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.107796] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.barbican_api_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.107956] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.barbican_endpoint = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.108145] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.barbican_endpoint_type = public {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.108306] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.barbican_region_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112246] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112246] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112246] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112246] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112246] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112246] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.number_of_retries = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112246] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.retry_delay = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112499] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.send_service_user_token = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112499] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112499] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112499] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.verify_ssl = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112499] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican.verify_ssl_path = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112499] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican_service_user.auth_section = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112499] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican_service_user.auth_type = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112676] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican_service_user.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112676] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican_service_user.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112676] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican_service_user.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112676] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican_service_user.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112676] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican_service_user.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112676] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican_service_user.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112676] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] barbican_service_user.timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112854] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.approle_role_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112854] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.approle_secret_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112854] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112854] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112854] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112854] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.112854] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.113045] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.kv_mountpoint = secret {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.113229] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.kv_version = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.113297] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.namespace = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.113453] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.root_token_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.113620] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.113779] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.ssl_ca_crt_file = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.113938] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.114113] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.use_ssl = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.114290] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.114459] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.114620] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.114783] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.114941] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.connect_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.115112] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.connect_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.115272] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.endpoint_override = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.115432] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.115612] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.115774] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.max_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.115929] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.min_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.116107] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.region_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.116326] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.service_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.116512] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.service_type = identity {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.116685] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.116844] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.status_code_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.117010] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.status_code_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.117177] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.117358] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.117519] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] keystone.version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.117724] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.connection_uri = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.117915] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.cpu_mode = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.118111] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.cpu_model_extra_flags = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.118286] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.cpu_models = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.118457] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.cpu_power_governor_high = performance {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.118656] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.cpu_power_governor_low = powersave {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.118830] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.cpu_power_management = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.119013] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.119193] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.device_detach_attempts = 8 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.119357] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.device_detach_timeout = 20 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.119522] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.disk_cachemodes = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.119683] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.disk_prefix = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.119846] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.enabled_perf_events = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.120024] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.file_backed_memory = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.120186] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.gid_maps = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.120344] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.hw_disk_discard = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.120501] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.hw_machine_type = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.120738] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.images_rbd_ceph_conf = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.120937] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.121125] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.121302] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.images_rbd_glance_store_name = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.121471] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.images_rbd_pool = rbd {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.121639] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.images_type = default {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.121798] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.images_volume_group = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.121959] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.inject_key = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.122132] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.inject_partition = -2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.122293] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.inject_password = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.122452] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.iscsi_iface = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.122611] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.iser_use_multipath = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.122781] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_bandwidth = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.122937] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.123109] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_downtime = 500 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.123275] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.123434] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.123592] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_inbound_addr = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.123798] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.124035] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_permit_post_copy = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.124222] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_scheme = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.124401] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_timeout_action = abort {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.124570] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_tunnelled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.124734] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_uri = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.124898] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.live_migration_with_native_tls = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.125072] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.max_queues = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.125239] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.125399] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.nfs_mount_options = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.125744] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.125917] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.126094] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.num_iser_scan_tries = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.126258] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.num_memory_encrypted_guests = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.126420] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.126581] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.num_pcie_ports = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.126750] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.num_volume_scan_tries = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.126913] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.pmem_namespaces = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.127081] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.quobyte_client_cfg = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.127374] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.127546] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.rbd_connect_timeout = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.127712] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.127900] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.128084] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.rbd_secret_uuid = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.128249] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.rbd_user = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.128412] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 574.128582] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.remote_filesystem_transport = ssh {{(pid=67144) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.128741] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.rescue_image_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.128900] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.rescue_kernel_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.129379] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.rescue_ramdisk_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.129379] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.129379] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.rx_queue_size = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.129544] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.smbfs_mount_options = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.129819] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.129989] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.snapshot_compression = False {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.130162] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.snapshot_image_format = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.130454] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.130556] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.sparse_logical_volumes = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.130721] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.swtpm_enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.130888] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.swtpm_group = tss {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131072] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.swtpm_user = tss {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131248] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.sysinfo_serial = unique {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131413] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.tx_queue_size = None {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131571] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.uid_maps = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131735] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.use_virtio_for_bridges = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131903] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.virt_type = kvm {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.132083] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.volume_clear = zero {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.132252] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.volume_clear_size = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.132420] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.volume_use_multipath = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.132579] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.vzstorage_cache_path = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.132751] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.132921] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.vzstorage_mount_group = qemu {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.133099] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.vzstorage_mount_opts = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.133271] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.133561] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.133734] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.vzstorage_mount_user = stack {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.133917] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134113] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.auth_section = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134296] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.auth_type = password {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134461] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134621] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134785] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134946] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.connect_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135121] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.connect_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135292] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.default_floating_pool = public {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135458] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.endpoint_override = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135646] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.extension_sync_interval = 600 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135817] env[67144]: DEBUG 
oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.http_retries = 3 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135980] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.136156] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.136316] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.max_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.136485] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.136646] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.min_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.136813] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.ovs_bridge = br-int {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.136979] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.physnets = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.137163] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.region_name = RegionOne {{(pid=67144) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.137336] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.service_metadata_proxy = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.137497] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.service_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.137666] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.service_type = network {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.137849] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138038] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.status_code_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138204] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.status_code_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138362] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138542] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138705] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] neutron.version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138878] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] notifications.bdms_in_notifications = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.139069] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] notifications.default_level = INFO {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.139311] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] notifications.notification_format = unversioned {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.139523] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] notifications.notify_on_state_change = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.139714] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.139895] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] pci.alias = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140077] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] pci.device_spec = [] {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140250] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] pci.report_in_placement = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140424] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.auth_section = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140601] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.auth_type = password {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140771] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140930] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.141099] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.141266] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.141425] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.connect_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.141582] env[67144]: 
DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.connect_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.141743] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.default_domain_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.141899] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.default_domain_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.142067] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.domain_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.142229] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.domain_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.142424] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.endpoint_override = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.142631] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.142798] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.142958] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
placement.max_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143129] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.min_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143301] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.password = **** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143465] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.project_domain_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143633] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.project_domain_name = Default {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143798] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.project_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143994] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.project_name = service {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.144195] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.region_name = RegionOne {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.144371] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.service_name = None {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.144546] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.service_type = placement {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.144716] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.144874] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.status_code_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145044] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.status_code_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145222] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.system_scope = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145384] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145574] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.trust_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145741] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.user_domain_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145911] env[67144]: 
DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.user_domain_name = Default {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146087] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.user_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146266] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.username = placement {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146492] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146612] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] placement.version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146793] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.cores = 20 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146957] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.count_usage_from_placement = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.147144] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.147322] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None 
None] quota.injected_file_content_bytes = 10240 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.147492] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.injected_file_path_length = 255 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.147660] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.injected_files = 5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.147827] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.instances = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.147994] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.key_pairs = 100 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.148175] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.metadata_items = 128 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.148342] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.ram = 51200 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.148509] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.recheck_quota = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.148676] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.server_group_members = 10 {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.148846] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] quota.server_groups = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.149019] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] rdp.enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.149339] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.149530] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.149701] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.149867] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler.image_metadata_prefilter = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.150042] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.150214] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler.max_attempts = 3 {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.150378] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler.max_placement_results = 1000 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.150540] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.150704] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler.query_placement_for_availability_zone = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.150862] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler.query_placement_for_image_type_support = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.151028] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.151207] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] scheduler.workers = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.151379] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.151547] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.151727] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.151895] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152077] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152246] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152412] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152603] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152772] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 
None None] filter_scheduler.host_subset_size = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152930] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.153123] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.153272] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.isolated_hosts = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.153434] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.isolated_images = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.153619] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.153786] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.153946] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.pci_in_placement = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154120] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 
None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154281] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154441] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154599] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154761] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154923] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.155097] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.track_instance_changes = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.155275] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
574.155444] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] metrics.required = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.155638] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] metrics.weight_multiplier = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.155808] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.155972] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] metrics.weight_setting = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.156297] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.156491] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] serial_console.enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.156699] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] serial_console.port_range = 10000:20000 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.156880] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.157063] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.157238] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] serial_console.serialproxy_port = 6083 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.157409] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] service_user.auth_section = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.157587] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] service_user.auth_type = password {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.157752] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] service_user.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.157912] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] service_user.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.158089] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] service_user.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.158251] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] service_user.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.158407] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] service_user.keyfile = None 
{{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.158579] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] service_user.send_service_user_token = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.158745] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] service_user.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.158903] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] service_user.timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159088] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.agent_enabled = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159268] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159573] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159771] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159946] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.html5proxy_port = 6082 {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160123] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.image_compression = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160288] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.jpeg_compression = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160447] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.playback_compression = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160618] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.server_listen = 127.0.0.1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160791] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160952] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.streaming_mode = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161126] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] spice.zlib_compression = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161294] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] upgrade_levels.baseapi = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161453] env[67144]: 
DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] upgrade_levels.cert = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161622] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] upgrade_levels.compute = auto {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161783] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] upgrade_levels.conductor = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161948] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] upgrade_levels.scheduler = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162129] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vendordata_dynamic_auth.auth_section = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162292] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vendordata_dynamic_auth.auth_type = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162449] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vendordata_dynamic_auth.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162606] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vendordata_dynamic_auth.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162770] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162931] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vendordata_dynamic_auth.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163097] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vendordata_dynamic_auth.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163263] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163418] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vendordata_dynamic_auth.timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163603] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.api_retry_count = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163767] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.ca_file = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163940] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.cache_prefix = devstack-image-cache {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.164122] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None 
None] vmware.cluster_name = testcl1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.164289] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.connection_pool_size = 10 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.164448] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.console_delay_seconds = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.164617] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.datastore_regex = ^datastore.* {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.164827] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.165007] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.host_password = **** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.165181] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.host_port = 443 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.165352] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.host_username = administrator@vsphere.local {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.165543] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.insecure = True {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.165721] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.integration_bridge = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.165891] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.maximum_objects = 100 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166061] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.pbm_default_policy = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166231] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.pbm_enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166392] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.pbm_wsdl_location = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166596] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166766] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.serial_port_proxy_uri = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166924] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.serial_port_service_uri = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167101] 
env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.task_poll_interval = 0.5 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167277] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.use_linked_clone = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167447] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.vnc_keymap = en-us {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167612] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.vnc_port = 5900 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167777] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vmware.vnc_port_total = 10000 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167963] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vnc.auth_schemes = ['none'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.168153] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vnc.enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.168447] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.168633] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
vnc.novncproxy_host = 0.0.0.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.168805] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vnc.novncproxy_port = 6080 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.168985] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vnc.server_listen = 127.0.0.1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169174] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169336] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vnc.vencrypt_ca_certs = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169494] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vnc.vencrypt_client_cert = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169652] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vnc.vencrypt_client_key = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169830] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169995] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.disable_deep_image_inspection = False {{(pid=67144) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170172] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170335] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170496] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170659] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.disable_rootwrap = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170820] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.enable_numa_live_migration = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170981] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171156] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171319] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
workarounds.handle_virt_lifecycle_events = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171480] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.libvirt_disable_apic = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171642] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171802] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171964] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172139] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172302] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172463] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172621] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172782] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172942] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173116] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173303] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173475] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] wsgi.client_socket_timeout = 900 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173640] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] wsgi.default_pool_size = 1000 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173804] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] wsgi.keep_alive = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173969] env[67144]: DEBUG 
oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] wsgi.max_header_line = 16384 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174145] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] wsgi.secure_proxy_ssl_header = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174306] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] wsgi.ssl_ca_file = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174465] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] wsgi.ssl_cert_file = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174623] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] wsgi.ssl_key_file = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174788] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] wsgi.tcp_keepidle = 600 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174982] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.175143] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] zvm.ca_file = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.175303] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] zvm.cloud_connector_url = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.175617] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.175799] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] zvm.reachable_timeout = 300 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.175982] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_policy.enforce_new_defaults = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.176172] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_policy.enforce_scope = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.176350] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_policy.policy_default_rule = default {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.176561] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.176751] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_policy.policy_file = policy.yaml {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.176928] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None 
None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177103] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177269] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177429] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177592] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177763] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177941] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178130] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] profiler.connection_string = messaging:// {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178300] env[67144]: DEBUG 
oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] profiler.enabled = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178468] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] profiler.es_doc_type = notification {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178632] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] profiler.es_scroll_size = 10000 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178800] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] profiler.es_scroll_time = 2m {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178961] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] profiler.filter_error_trace = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179142] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] profiler.hmac_keys = SECRET_KEY {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179309] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] profiler.sentinel_service_name = mymaster {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179479] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] profiler.socket_timeout = 0.1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179643] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
profiler.trace_sqlalchemy = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179807] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] remote_debug.host = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179966] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] remote_debug.port = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.180158] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.180325] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.180490] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.180652] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.180812] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.180973] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181146] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181308] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181467] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181622] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181793] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181957] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182139] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182306] env[67144]: DEBUG 
oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182467] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182641] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182803] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182964] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.183140] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.183304] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.183465] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.183630] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.183794] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.183953] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184133] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184300] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.ssl = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184474] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184648] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184814] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=67144) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184982] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.185167] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_rabbit.ssl_version = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.185355] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.185563] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_notifications.retry = -1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.185760] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.185938] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_messaging_notifications.transport_url = **** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186122] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.auth_section = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186285] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.auth_type = None {{(pid=67144) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186442] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.cafile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186633] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.certfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186799] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.collect_timing = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186956] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.connect_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.187126] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.connect_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.187286] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.endpoint_id = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.187441] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.endpoint_override = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.187600] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.insecure = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.187756] env[67144]: DEBUG 
oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.keyfile = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.187910] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.max_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.188072] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.min_version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.188230] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.region_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.188383] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.service_name = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.188541] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.service_type = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.188703] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.split_loggers = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.188859] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.status_code_retries = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189025] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
oslo_limit.status_code_retry_delay = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189187] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.timeout = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189342] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.valid_interfaces = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189496] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_limit.version = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189660] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_reports.file_event_handler = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189822] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189979] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] oslo_reports.log_dir = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190161] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190318] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plug_linux_bridge_privileged.group = None 
{{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190474] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190640] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190803] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190960] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.191142] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.191299] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plug_ovs_privileged.group = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.191455] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.191618] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.191780] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.191936] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] vif_plug_ovs_privileged.user = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.192113] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_linux_bridge.flat_interface = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.192293] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.192463] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.192632] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.192801] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.192964] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.193157] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.193345] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.193530] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_ovs.isolate_vif = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.193702] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.193865] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194050] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194224] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_ovs.ovsdb_interface = native {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194388] env[67144]: DEBUG oslo_service.service [None 
req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_vif_ovs.per_port_bridge = False {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194549] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_brick.lock_path = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194716] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194876] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.195069] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] privsep_osbrick.capabilities = [21] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.195218] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] privsep_osbrick.group = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.195375] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] privsep_osbrick.helper_command = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.195564] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.195736] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
privsep_osbrick.thread_pool_size = 8 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.195894] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] privsep_osbrick.user = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196079] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196240] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] nova_sys_admin.group = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196398] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] nova_sys_admin.helper_command = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196569] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196728] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196883] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] nova_sys_admin.user = None {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.197017] env[67144]: DEBUG oslo_service.service [None req-405a3200-4d77-4b37-9eaa-a981a8e340a4 None None] 
******************************************************************************** {{(pid=67144) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 574.197431] env[67144]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 574.205746] env[67144]: INFO nova.virt.node [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Generated node identity 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 [ 574.205981] env[67144]: INFO nova.virt.node [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Wrote node identity 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 to /opt/stack/data/n-cpu-1/compute_id [ 574.216598] env[67144]: WARNING nova.compute.manager [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Compute nodes ['0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 574.246176] env[67144]: INFO nova.compute.manager [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 574.267067] env[67144]: WARNING nova.compute.manager [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
[ 574.267293] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 574.267500] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 574.267643] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 574.267798] env[67144]: DEBUG nova.compute.resource_tracker [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67144) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 574.269325] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1b09366-0a29-4b56-80fa-18f62c0051b8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.277600] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85495853-c36b-420d-b27a-2bb81743dd36 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.291155] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-92b5e678-a9dd-4557-968e-0a334b8167d4 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.297182] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4519d360-f495-409c-8b60-a75ab715e994 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.325532] env[67144]: DEBUG nova.compute.resource_tracker [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181056MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=67144) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 574.325665] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 574.325838] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 574.337553] env[67144]: WARNING nova.compute.resource_tracker [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] No compute node record for cpu-1:0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 could not be found. 
[ 574.349657] env[67144]: INFO nova.compute.resource_tracker [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 [ 574.397373] env[67144]: DEBUG nova.compute.resource_tracker [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 574.397574] env[67144]: DEBUG nova.compute.resource_tracker [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 574.500903] env[67144]: INFO nova.scheduler.client.report [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] [req-84a64922-9dec-44d9-9695-81384a7ebe9c] Created resource provider record via placement API for resource provider with UUID 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
[ 574.516706] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3c14e8a-b076-4ea2-8c40-51f65e739e3b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.523881] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8254dc46-3ba8-404d-843a-d15439e2d553 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.553167] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98834a53-a9c8-4afa-88d5-74741d7205dd {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.560357] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0833f6d2-f8d8-4001-bdac-693f15da0c70 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.573129] env[67144]: DEBUG nova.compute.provider_tree [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Updating inventory in ProviderTree for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 574.607343] env[67144]: DEBUG nova.scheduler.client.report [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Updated inventory for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 
'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 574.607577] env[67144]: DEBUG nova.compute.provider_tree [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Updating resource provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 generation from 0 to 1 during operation: update_inventory {{(pid=67144) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 574.607723] env[67144]: DEBUG nova.compute.provider_tree [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Updating inventory in ProviderTree for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 574.653248] env[67144]: DEBUG nova.compute.provider_tree [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Updating resource provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 generation from 1 to 2 during operation: update_traits {{(pid=67144) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 574.670302] env[67144]: DEBUG nova.compute.resource_tracker [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67144) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 574.670502] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.345s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 574.670670] env[67144]: DEBUG nova.service [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Creating RPC server for service compute {{(pid=67144) start /opt/stack/nova/nova/service.py:182}} [ 574.683927] env[67144]: DEBUG nova.service [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] Join ServiceGroup membership for this service compute {{(pid=67144) start /opt/stack/nova/nova/service.py:199}} [ 574.684184] env[67144]: DEBUG nova.servicegroup.drivers.db [None req-9be0912e-891f-47ee-a8a3-f09d9492e7b4 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=67144) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 586.687344] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 586.697785] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Getting list of instances from cluster (obj){ [ 586.697785] env[67144]: value = "domain-c8" [ 586.697785] env[67144]: _type = "ClusterComputeResource" [ 586.697785] env[67144]: } {{(pid=67144) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 586.698968] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93880916-21cc-4b46-804b-0f05c074a226 {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.708103] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Got total of 0 instances {{(pid=67144) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 586.708323] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 586.708617] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Getting list of instances from cluster (obj){ [ 586.708617] env[67144]: value = "domain-c8" [ 586.708617] env[67144]: _type = "ClusterComputeResource" [ 586.708617] env[67144]: } {{(pid=67144) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 586.709465] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6feb0859-e1be-4511-a52d-44d2929c2fbb {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.716391] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Got total of 0 instances {{(pid=67144) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 607.954419] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Acquiring lock "668949c5-1c0b-46a5-a0bc-5406f774b2e3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.954419] env[67144]: DEBUG oslo_concurrency.lockutils [None 
req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Lock "668949c5-1c0b-46a5-a0bc-5406f774b2e3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.971154] env[67144]: DEBUG nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 608.084811] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.084970] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.086525] env[67144]: INFO nova.compute.claims [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 608.215466] env[67144]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2d6e3be-af2c-4cff-8f76-a2c331c788a7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.225814] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aed88bc6-27d1-41ce-b57b-15394d6b111d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.259800] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5f5f28f-f231-44f7-925c-43dc20559443 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.267983] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4014cd8-57be-451f-969b-372edde88e3f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.283705] env[67144]: DEBUG nova.compute.provider_tree [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 608.296096] env[67144]: DEBUG nova.scheduler.client.report [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 608.316044] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.316044] env[67144]: DEBUG nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 608.357127] env[67144]: DEBUG nova.compute.utils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 608.357754] env[67144]: DEBUG nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Allocating IP information in the background. 
{{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 608.358174] env[67144]: DEBUG nova.network.neutron [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 608.369230] env[67144]: DEBUG nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 608.452254] env[67144]: DEBUG nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Start spawning the instance on the hypervisor. 
{{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 610.386584] env[67144]: DEBUG nova.virt.hardware [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 610.386885] env[67144]: DEBUG nova.virt.hardware [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 610.386987] env[67144]: DEBUG nova.virt.hardware [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 610.387190] env[67144]: DEBUG nova.virt.hardware [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 
tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 610.387338] env[67144]: DEBUG nova.virt.hardware [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 610.387488] env[67144]: DEBUG nova.virt.hardware [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 610.387706] env[67144]: DEBUG nova.virt.hardware [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 610.387876] env[67144]: DEBUG nova.virt.hardware [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 610.388230] env[67144]: DEBUG nova.virt.hardware [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 610.388403] env[67144]: DEBUG 
nova.virt.hardware [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 610.388571] env[67144]: DEBUG nova.virt.hardware [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 610.389498] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5d9db22-ffd9-4b4e-8b2a-7117addb6e80 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.398250] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6347c3fb-e5bc-4674-b0f3-701e55d68496 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.416236] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d14c7309-0e30-4087-9dd5-4f7705aca937 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.491558] env[67144]: DEBUG nova.policy [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c255c724bd654f80b46b6cff35628a6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c40cd63bee774882826bb27f77af4999', 'project_domain_id': 'default', 'roles': 
['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 610.969449] env[67144]: DEBUG nova.network.neutron [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Successfully created port: c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 611.634955] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Acquiring lock "54af505e-0f30-4848-bd14-04461db40664" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.634955] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Lock "54af505e-0f30-4848-bd14-04461db40664" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.652402] env[67144]: DEBUG nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Starting instance... 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 611.728835] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.731943] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.733456] env[67144]: INFO nova.compute.claims [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 611.852728] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b81d68f-a559-4098-9975-c586803c0d0c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.860755] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7264e13c-68f7-45d5-94f7-f430297b5657 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.893526] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87c0fef1-a53e-4021-bbc2-1ed56b3c73f2 {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.901551] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccf4aed7-0046-417f-ae91-90587f8f9d39 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.918877] env[67144]: DEBUG nova.compute.provider_tree [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 611.928513] env[67144]: DEBUG nova.scheduler.client.report [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 611.951530] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.222s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.951932] env[67144]: DEBUG nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 
tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 611.992030] env[67144]: DEBUG nova.compute.utils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 611.993367] env[67144]: DEBUG nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 611.993698] env[67144]: DEBUG nova.network.neutron [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 612.008564] env[67144]: DEBUG nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 612.092896] env[67144]: DEBUG nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Start spawning the instance on the hypervisor.
{{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 612.126302] env[67144]: DEBUG nova.virt.hardware [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 612.126911] env[67144]: DEBUG nova.virt.hardware [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 612.127185] env[67144]: DEBUG nova.virt.hardware [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 612.127590] env[67144]: DEBUG nova.virt.hardware [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Flavor pref 0:0:0 {{(pid=67144)
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 612.127885] env[67144]: DEBUG nova.virt.hardware [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 612.128144] env[67144]: DEBUG nova.virt.hardware [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 612.128457] env[67144]: DEBUG nova.virt.hardware [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 612.128772] env[67144]: DEBUG nova.virt.hardware [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 612.129150] env[67144]: DEBUG nova.virt.hardware [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 612.129499] env[67144]: DEBUG nova.virt.hardware [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Possible topologies
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 612.129777] env[67144]: DEBUG nova.virt.hardware [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 612.130823] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13d075cf-3e97-46ec-aa59-f3cb9dce8f16 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 612.141852] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6de0959-7f53-4d03-93ca-f173e3dc1a8a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 612.147603] env[67144]: DEBUG nova.policy [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9f2ad11bf1bb490f8361ec1ea42bc702', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7396c78ebebb4a65acb57e089170eb97', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}}
[ 612.560556] env[67144]: DEBUG nova.network.neutron [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664]
Successfully created port: 15c37861-6f14-4266-b774-76cd03efa607 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 614.060099] env[67144]: DEBUG nova.network.neutron [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Successfully updated port: c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 614.075068] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Acquiring lock "refresh_cache-668949c5-1c0b-46a5-a0bc-5406f774b2e3" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 614.076316] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Acquired lock "refresh_cache-668949c5-1c0b-46a5-a0bc-5406f774b2e3" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 614.076643] env[67144]: DEBUG nova.network.neutron [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 614.187228] env[67144]: DEBUG nova.network.neutron [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Instance cache missing network info.
{{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 614.500607] env[67144]: DEBUG nova.network.neutron [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Updating instance_info_cache with network_info: [{"id": "c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a", "address": "fa:16:3e:81:64:4a", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7c530d2-6a", "ovs_interfaceid": "c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 614.523783] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Releasing lock "refresh_cache-668949c5-1c0b-46a5-a0bc-5406f774b2e3" {{(pid=67144) lock
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 614.524467] env[67144]: DEBUG nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Instance network_info: |[{"id": "c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a", "address": "fa:16:3e:81:64:4a", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7c530d2-6a", "ovs_interfaceid": "c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 614.527909] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:81:64:4a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id':
'27abaf31-0f39-428c-a8d3-cd7548de6818', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 614.546143] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 614.546758] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-227b1223-de2f-4475-a0a4-9752f1bce290 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.561996] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Created folder: OpenStack in parent group-v4.
[ 614.562445] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Creating folder: Project (c40cd63bee774882826bb27f77af4999). Parent ref: group-v572613.
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 614.562445] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8f4ad3e1-f167-47e0-ac88-840402c6d106 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.573873] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Created folder: Project (c40cd63bee774882826bb27f77af4999) in parent group-v572613.
[ 614.573873] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Creating folder: Instances. Parent ref: group-v572614. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 614.574033] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c29161b2-3282-4ba0-ae31-909cef6123bb {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.584600] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Created folder: Instances in parent group-v572614.
[ 614.584945] env[67144]: DEBUG oslo.service.loopingcall [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return.
{{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 614.585046] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 614.585252] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cdf8b8a2-b9a1-4c8a-ba1d-8a55fb9a7e4a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.614184] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 614.614184] env[67144]: value = "task-2847992"
[ 614.614184] env[67144]: _type = "Task"
[ 614.614184] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 614.625449] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847992, 'name': CreateVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 614.873069] env[67144]: DEBUG nova.network.neutron [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Successfully updated port: 15c37861-6f14-4266-b774-76cd03efa607 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 614.886211] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Acquiring lock "refresh_cache-54af505e-0f30-4848-bd14-04461db40664" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 614.886472] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912
tempest-ServerDiagnosticsTest-447605912-project-member] Acquired lock "refresh_cache-54af505e-0f30-4848-bd14-04461db40664" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 614.886541] env[67144]: DEBUG nova.network.neutron [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 614.953678] env[67144]: DEBUG nova.network.neutron [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Instance cache missing network info. {{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 615.127635] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847992, 'name': CreateVM_Task, 'duration_secs': 0.315614} completed successfully.
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 615.127897] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 615.140204] env[67144]: DEBUG oslo_vmware.service [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec088839-8a87-4d66-a525-95c8eed7713a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.149186] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 615.149704] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 615.150768] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 615.150993] env[67144]: DEBUG
oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4b3f0966-3fcb-43a1-a8ed-6c8f378fa463 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.157178] env[67144]: DEBUG oslo_vmware.api [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Waiting for the task: (returnval){
[ 615.157178] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]529ae833-886f-57a3-ff86-033ad8e2eaf8"
[ 615.157178] env[67144]: _type = "Task"
[ 615.157178] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 615.166524] env[67144]: DEBUG oslo_vmware.api [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]529ae833-886f-57a3-ff86-033ad8e2eaf8, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 615.282633] env[67144]: DEBUG nova.network.neutron [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Updating instance_info_cache with network_info: [{"id": "15c37861-6f14-4266-b774-76cd03efa607", "address": "fa:16:3e:e3:cc:b4", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap15c37861-6f", "ovs_interfaceid": "15c37861-6f14-4266-b774-76cd03efa607", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 615.304373] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Releasing lock "refresh_cache-54af505e-0f30-4848-bd14-04461db40664" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 615.304373]
env[67144]: DEBUG nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Instance network_info: |[{"id": "15c37861-6f14-4266-b774-76cd03efa607", "address": "fa:16:3e:e3:cc:b4", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap15c37861-6f", "ovs_interfaceid": "15c37861-6f14-4266-b774-76cd03efa607", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 615.304557] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e3:cc:b4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27abaf31-0f39-428c-a8d3-cd7548de6818', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id':
'15c37861-6f14-4266-b774-76cd03efa607', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 615.313946] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Creating folder: Project (7396c78ebebb4a65acb57e089170eb97). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 615.314787] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b7f66da0-87ef-4d48-a637-e43b495a7612 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.327270] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Created folder: Project (7396c78ebebb4a65acb57e089170eb97) in parent group-v572613.
[ 615.327531] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Creating folder: Instances. Parent ref: group-v572617. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 615.327853] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-88c880b1-b2f7-402b-8c9a-926ad7c7cd97 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.338504] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Created folder: Instances in parent group-v572617.
[ 615.338828] env[67144]: DEBUG oslo.service.loopingcall [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 615.338958] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 54af505e-0f30-4848-bd14-04461db40664] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 615.339185] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-572b885b-8b16-4295-a413-784bf990f0a5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.366652] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 615.366652] env[67144]: value = "task-2847995"
[ 615.366652] env[67144]: _type = "Task"
[ 615.366652] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 615.373714] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847995, 'name': CreateVM_Task} progress is 0%.
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 615.547725] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Acquiring lock "b04052f8-b29f-4b32-b249-02b83d3d77f9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 615.547998] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Lock "b04052f8-b29f-4b32-b249-02b83d3d77f9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 615.561989] env[67144]: DEBUG nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Starting instance...
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 615.628236] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 615.628490] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 615.630268] env[67144]: INFO nova.compute.claims [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 615.668397] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 615.668717] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 615.668993] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 615.669271] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 615.672018] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 615.672018] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-af72e625-5df6-4142-8191-c0040ee482e6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.679128] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 615.679424] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None 
req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 615.680785] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3534e966-e469-484a-94dc-21c706f76580 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.691385] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a938dc9d-9616-427e-a11f-84c57658eae7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.696952] env[67144]: DEBUG oslo_vmware.api [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Waiting for the task: (returnval){ [ 615.696952] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52194313-11e2-ca85-063b-324237c3c0c0" [ 615.696952] env[67144]: _type = "Task" [ 615.696952] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 615.704884] env[67144]: DEBUG oslo_vmware.api [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52194313-11e2-ca85-063b-324237c3c0c0, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 615.747802] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1558ed54-87ad-4e3c-badf-79dc7250238c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.755471] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-895baf96-058c-438d-a243-ce0d2ed1611c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.794655] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07210fa6-f61a-4bc2-b241-9af6b89edef2 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.804797] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec9cee80-8934-4c2d-891d-3aa7a9be3a68 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.820379] env[67144]: DEBUG nova.compute.provider_tree [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 615.833386] env[67144]: DEBUG nova.scheduler.client.report [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 615.847584] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.848111] env[67144]: DEBUG nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 615.873779] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847995, 'name': CreateVM_Task, 'duration_secs': 0.30441} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 615.873912] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 54af505e-0f30-4848-bd14-04461db40664] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 615.874607] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 615.875000] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 615.875114] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 615.875304] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ffc604ea-199d-4514-b143-143ffce18133 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.880585] env[67144]: DEBUG oslo_vmware.api [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Waiting for the task: (returnval){ [ 615.880585] 
env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5252b2f7-5f61-27ea-7996-27e5070f2535" [ 615.880585] env[67144]: _type = "Task" [ 615.880585] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 615.893102] env[67144]: DEBUG oslo_vmware.api [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5252b2f7-5f61-27ea-7996-27e5070f2535, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 615.898302] env[67144]: DEBUG nova.compute.utils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 615.900388] env[67144]: DEBUG nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Allocating IP information in the background. 
{{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 615.900451] env[67144]: DEBUG nova.network.neutron [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 615.911362] env[67144]: DEBUG nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 616.009096] env[67144]: DEBUG nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Start spawning the instance on the hypervisor. 
{{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 616.036265] env[67144]: DEBUG nova.virt.hardware [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 616.036512] env[67144]: DEBUG nova.virt.hardware [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 616.036680] env[67144]: DEBUG nova.virt.hardware [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 616.036882] env[67144]: DEBUG nova.virt.hardware [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Flavor pref 
0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 616.037187] env[67144]: DEBUG nova.virt.hardware [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 616.037417] env[67144]: DEBUG nova.virt.hardware [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 616.037602] env[67144]: DEBUG nova.virt.hardware [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 616.037758] env[67144]: DEBUG nova.virt.hardware [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 616.037920] env[67144]: DEBUG nova.virt.hardware [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 616.038339] env[67144]: DEBUG nova.virt.hardware [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 
tempest-ServersTestFqdnHostnames-803617242-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 616.038339] env[67144]: DEBUG nova.virt.hardware [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 616.039133] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ab7c2ae-c1e9-4a0e-af85-42a27ab15d6a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.047436] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc4710be-8892-43d1-9dc7-9a5c879dbae2 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.160997] env[67144]: DEBUG nova.policy [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7fd8488bfa074a7cbf9d56536b690b1d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b2c1e53a6c6f44958175b6d128c00596', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 616.215028] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 
tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 616.215028] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Creating directory with path [datastore1] vmware_temp/2092258b-0b05-407b-a66c-31f8aa075834/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 616.215028] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-af03c829-792b-403d-9347-5c46e54148b8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.235404] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Created directory with path [datastore1] vmware_temp/2092258b-0b05-407b-a66c-31f8aa075834/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 616.235626] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Fetch image to [datastore1] vmware_temp/2092258b-0b05-407b-a66c-31f8aa075834/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 616.235824] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 
tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/2092258b-0b05-407b-a66c-31f8aa075834/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 616.236643] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dc823d4-3a35-45ac-9375-5c14c09cee57 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.249033] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6954b6d6-51c3-437b-bca9-f7c67137b235 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.260036] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bddc27d6-3a56-424d-8340-d07143271f3c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.295255] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e25ccbc-520a-443f-8b0f-8bc90a14f19a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.302463] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a0f69ab4-0205-4f04-b823-389e292acd81 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.324238] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] 
Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 616.392030] env[67144]: DEBUG nova.compute.manager [req-337a8e93-7aa4-40e6-bba1-6ad0999e3bec req-3e72c831-7389-4d90-9c7a-4dfbf5d43278 service nova] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Received event network-vif-plugged-c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 616.392030] env[67144]: DEBUG oslo_concurrency.lockutils [req-337a8e93-7aa4-40e6-bba1-6ad0999e3bec req-3e72c831-7389-4d90-9c7a-4dfbf5d43278 service nova] Acquiring lock "668949c5-1c0b-46a5-a0bc-5406f774b2e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 616.392921] env[67144]: DEBUG oslo_concurrency.lockutils [req-337a8e93-7aa4-40e6-bba1-6ad0999e3bec req-3e72c831-7389-4d90-9c7a-4dfbf5d43278 service nova] Lock "668949c5-1c0b-46a5-a0bc-5406f774b2e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 616.393339] env[67144]: DEBUG oslo_concurrency.lockutils [req-337a8e93-7aa4-40e6-bba1-6ad0999e3bec req-3e72c831-7389-4d90-9c7a-4dfbf5d43278 service nova] Lock "668949c5-1c0b-46a5-a0bc-5406f774b2e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 616.394229] env[67144]: DEBUG nova.compute.manager [req-337a8e93-7aa4-40e6-bba1-6ad0999e3bec req-3e72c831-7389-4d90-9c7a-4dfbf5d43278 service nova] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] No waiting events found dispatching 
network-vif-plugged-c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 616.394229] env[67144]: WARNING nova.compute.manager [req-337a8e93-7aa4-40e6-bba1-6ad0999e3bec req-3e72c831-7389-4d90-9c7a-4dfbf5d43278 service nova] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Received unexpected event network-vif-plugged-c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a for instance with vm_state building and task_state spawning. [ 616.404094] env[67144]: DEBUG oslo_vmware.rw_handles [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2092258b-0b05-407b-a66c-31f8aa075834/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 616.405768] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 616.406360] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 616.406599] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 
tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 616.474060] env[67144]: DEBUG oslo_vmware.rw_handles [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Completed reading data from the image iterator. {{(pid=67144) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 616.474551] env[67144]: DEBUG oslo_vmware.rw_handles [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2092258b-0b05-407b-a66c-31f8aa075834/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67144) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 616.687459] env[67144]: DEBUG nova.network.neutron [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Successfully created port: 123b0146-c529-4dd6-800b-2e7bbbcb716b {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 617.028145] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquiring lock "99cbc3d9-8c82-4a32-8adb-59572bab2eca" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.028399] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Lock "99cbc3d9-8c82-4a32-8adb-59572bab2eca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.041027] env[67144]: DEBUG nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Starting instance... 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 617.109562] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.109729] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.111204] env[67144]: INFO nova.compute.claims [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 617.256706] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b4b1d85-e4e5-4c5e-814d-9cd7a2ae5eba {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.266167] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4901ada6-6c93-441d-a443-1895d96c1454 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.297912] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-627d46c8-b6ba-45b9-b14c-bd86bbddf08e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
617.306389] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32f62218-d309-4209-9dcb-44ed71195503 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.320511] env[67144]: DEBUG nova.compute.provider_tree [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 617.332614] env[67144]: DEBUG nova.scheduler.client.report [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 617.351094] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.351678] env[67144]: DEBUG nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] 
[instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 617.393239] env[67144]: DEBUG nova.compute.utils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 617.394946] env[67144]: DEBUG nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Not allocating networking since 'none' was specified. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 617.405960] env[67144]: DEBUG nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 617.489661] env[67144]: DEBUG nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Start spawning the instance on the hypervisor. 
{{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 617.520885] env[67144]: DEBUG nova.virt.hardware [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 617.520885] env[67144]: DEBUG nova.virt.hardware [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 617.520885] env[67144]: DEBUG nova.virt.hardware [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 617.521106] env[67144]: DEBUG nova.virt.hardware [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Flavor pref 0:0:0 {{(pid=67144) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 617.521231] env[67144]: DEBUG nova.virt.hardware [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 617.521307] env[67144]: DEBUG nova.virt.hardware [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 617.521532] env[67144]: DEBUG nova.virt.hardware [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 617.521667] env[67144]: DEBUG nova.virt.hardware [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 617.521834] env[67144]: DEBUG nova.virt.hardware [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 617.521984] env[67144]: DEBUG nova.virt.hardware [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 617.522173] env[67144]: DEBUG nova.virt.hardware [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 617.523160] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d97772e2-941a-407e-8517-7a83e1d44a38 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.532421] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca355d11-424c-4ebe-be7a-ea4e5c119c55 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.547799] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Instance VIF info [] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 617.553417] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Creating folder: Project (b95b8cee9b3c4ff18fc94014e3a3b349). Parent ref: group-v572613. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 617.554262] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8a2abd39-d206-46b3-a6c1-3a343314c579 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.565238] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Created folder: Project (b95b8cee9b3c4ff18fc94014e3a3b349) in parent group-v572613. [ 617.565828] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Creating folder: Instances. Parent ref: group-v572620. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 617.565828] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5cf881b4-c3de-416e-b083-8099a39f95ea {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.577902] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Created folder: Instances in parent group-v572620. [ 617.578188] env[67144]: DEBUG oslo.service.loopingcall [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 617.578384] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 617.578581] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-558eefe9-d466-4fc7-bfff-d56c7e04c05e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.595851] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 617.595851] env[67144]: value = "task-2847998" [ 617.595851] env[67144]: _type = "Task" [ 617.595851] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 617.604520] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847998, 'name': CreateVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 618.110057] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847998, 'name': CreateVM_Task, 'duration_secs': 0.264954} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 618.110233] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 618.110844] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 618.111104] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 618.111443] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 618.111688] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4dd0b6c6-1f87-4332-87ae-5400469e7d55 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.120896] env[67144]: DEBUG oslo_vmware.api [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Waiting for the task: (returnval){ [ 618.120896] env[67144]: 
value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52015a92-ee99-7293-66fc-d5db9ffaa4e4" [ 618.120896] env[67144]: _type = "Task" [ 618.120896] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 618.132072] env[67144]: DEBUG oslo_vmware.api [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52015a92-ee99-7293-66fc-d5db9ffaa4e4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 618.252503] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Acquiring lock "6cbf4358-dcfa-471b-ae1a-e6a512c47d26" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.252503] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Lock "6cbf4358-dcfa-471b-ae1a-e6a512c47d26" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.280523] env[67144]: DEBUG nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Starting instance... 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 618.349776] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.350037] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.351815] env[67144]: INFO nova.compute.claims [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 618.535152] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd0f721e-bb1f-4fcd-b6c7-2cb21760a04e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.547086] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de1aa0ec-a2dd-4ee1-9c7d-ca25b8f2bd3b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.584178] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-399fa262-ad19-4f0f-be3e-12c3f23aca67 {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.591306] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00d26675-06d6-4e1a-870b-f9cc7135eeba {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.605112] env[67144]: DEBUG nova.compute.provider_tree [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 618.619062] env[67144]: DEBUG nova.scheduler.client.report [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 618.634629] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 618.634954] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 
tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 618.635195] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 618.644288] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.294s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 618.644799] env[67144]: DEBUG nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Start building networks asynchronously for instance. 
{{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 618.687592] env[67144]: DEBUG nova.compute.utils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 618.688879] env[67144]: DEBUG nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 618.689398] env[67144]: DEBUG nova.network.neutron [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 618.702137] env[67144]: DEBUG nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 618.788763] env[67144]: DEBUG nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Start spawning the instance on the hypervisor. 
{{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 618.822257] env[67144]: DEBUG nova.virt.hardware [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 618.822508] env[67144]: DEBUG nova.virt.hardware [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 618.822668] env[67144]: DEBUG nova.virt.hardware [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 618.822851] env[67144]: DEBUG nova.virt.hardware [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 
tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 618.822990] env[67144]: DEBUG nova.virt.hardware [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 618.826335] env[67144]: DEBUG nova.virt.hardware [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 618.826629] env[67144]: DEBUG nova.virt.hardware [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 618.826827] env[67144]: DEBUG nova.virt.hardware [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 618.827045] env[67144]: DEBUG nova.virt.hardware [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 618.827249] env[67144]: DEBUG 
nova.virt.hardware [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 618.827526] env[67144]: DEBUG nova.virt.hardware [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 618.828944] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94839318-6e7a-4662-90ab-66eb12535479 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.841225] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f44360e5-0265-4f04-a5fd-b83b4ec25772 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.022524] env[67144]: DEBUG nova.network.neutron [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Successfully updated port: 123b0146-c529-4dd6-800b-2e7bbbcb716b {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 619.040197] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Acquiring lock "refresh_cache-b04052f8-b29f-4b32-b249-02b83d3d77f9" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 
619.040343] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Acquired lock "refresh_cache-b04052f8-b29f-4b32-b249-02b83d3d77f9" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 619.040490] env[67144]: DEBUG nova.network.neutron [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 619.055560] env[67144]: DEBUG nova.policy [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'abab09e0241a4e948c00bfec55b66874', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1996e6e4309b4af8b8fce4f844903eb8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 619.211800] env[67144]: DEBUG nova.network.neutron [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Instance cache missing network info. 
{{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 619.287516] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "c2d5335a-4332-4828-855d-380cdea64a1a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 619.287852] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "c2d5335a-4332-4828-855d-380cdea64a1a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.309539] env[67144]: DEBUG nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Starting instance... 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 619.370761] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 619.371043] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.372599] env[67144]: INFO nova.compute.claims [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 619.543157] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2199a7c-77a9-447e-82b2-916f0dc3597b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.551487] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1cd650a-01d5-488b-af7e-83fc83454634 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.586770] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5129f95-82da-485d-af7f-c5147ae5a85d {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.594700] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-873cfced-8428-4344-875c-fbed23a149b3 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.609234] env[67144]: DEBUG nova.compute.provider_tree [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 619.625576] env[67144]: DEBUG nova.scheduler.client.report [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 619.647632] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 619.648164] env[67144]: DEBUG nova.compute.manager [None 
req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 619.697804] env[67144]: DEBUG nova.compute.utils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 619.699103] env[67144]: DEBUG nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 619.699277] env[67144]: DEBUG nova.network.neutron [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 619.710223] env[67144]: DEBUG nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Start building block device mappings for instance. 
{{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 619.789268] env[67144]: DEBUG nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Start spawning the instance on the hypervisor. {{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 619.820911] env[67144]: DEBUG nova.virt.hardware [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 619.821084] env[67144]: DEBUG nova.virt.hardware [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 619.821157] env[67144]: DEBUG nova.virt.hardware [None req-3eb75b15-14b4-471d-9209-4d48ae984830 
tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 619.821298] env[67144]: DEBUG nova.virt.hardware [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 619.821443] env[67144]: DEBUG nova.virt.hardware [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 619.821587] env[67144]: DEBUG nova.virt.hardware [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 619.821799] env[67144]: DEBUG nova.virt.hardware [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 619.821968] env[67144]: DEBUG nova.virt.hardware [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 619.827895] env[67144]: 
DEBUG nova.virt.hardware [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 619.827895] env[67144]: DEBUG nova.virt.hardware [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 619.829798] env[67144]: DEBUG nova.virt.hardware [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 619.829798] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a9d933c-efcd-45da-90a7-9817e0efea21 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.841113] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c232eac-e66b-466f-bcad-f825fef4ed9b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.867138] env[67144]: DEBUG nova.policy [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bca0226d36648fc8bc370f16b62f1a4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'38e6d6ab2a79447bb038b72c6787028f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 620.057039] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Acquiring lock "5bb4c082-f5fc-42e6-891a-4866eef1add6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.057039] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Lock "5bb4c082-f5fc-42e6-891a-4866eef1add6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.066479] env[67144]: DEBUG nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Starting instance... 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 620.135209] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.135469] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.136980] env[67144]: INFO nova.compute.claims [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 620.279266] env[67144]: DEBUG nova.network.neutron [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Updating instance_info_cache with network_info: [{"id": "123b0146-c529-4dd6-800b-2e7bbbcb716b", "address": "fa:16:3e:88:54:f4", "network": {"id": "06dc3446-bc0e-488d-9aef-13f79682ef85", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1175299775-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": 
[], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b2c1e53a6c6f44958175b6d128c00596", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a071ecf4-e713-4f97-9271-8c17952f6dee", "external-id": "nsx-vlan-transportzone-23", "segmentation_id": 23, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap123b0146-c5", "ovs_interfaceid": "123b0146-c529-4dd6-800b-2e7bbbcb716b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 620.307503] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Releasing lock "refresh_cache-b04052f8-b29f-4b32-b249-02b83d3d77f9" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 620.307835] env[67144]: DEBUG nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Instance network_info: |[{"id": "123b0146-c529-4dd6-800b-2e7bbbcb716b", "address": "fa:16:3e:88:54:f4", "network": {"id": "06dc3446-bc0e-488d-9aef-13f79682ef85", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1175299775-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": 
"192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b2c1e53a6c6f44958175b6d128c00596", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a071ecf4-e713-4f97-9271-8c17952f6dee", "external-id": "nsx-vlan-transportzone-23", "segmentation_id": 23, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap123b0146-c5", "ovs_interfaceid": "123b0146-c529-4dd6-800b-2e7bbbcb716b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 620.308854] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:88:54:f4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a071ecf4-e713-4f97-9271-8c17952f6dee', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '123b0146-c529-4dd6-800b-2e7bbbcb716b', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 620.320221] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Creating folder: Project (b2c1e53a6c6f44958175b6d128c00596). Parent ref: group-v572613. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 620.323893] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-82e083eb-9221-4461-be79-8082532b0467 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.335887] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Created folder: Project (b2c1e53a6c6f44958175b6d128c00596) in parent group-v572613. [ 620.335887] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Creating folder: Instances. Parent ref: group-v572623. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 620.336268] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c08a8304-25cf-4fed-8f83-72c8e5a81b9a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.395022] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d98cbb84-cc02-48c5-827f-fb77897f27d0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.401276] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Created folder: Instances in parent group-v572623. 
[ 620.401514] env[67144]: DEBUG oslo.service.loopingcall [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 620.402828] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 620.403237] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-73afc384-04f7-49c2-a5ea-032b9ed6a123 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.430140] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5903dbac-047f-46e7-b598-4750020bcd6b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.435365] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 620.435365] env[67144]: value = "task-2848001" [ 620.435365] env[67144]: _type = "Task" [ 620.435365] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 620.471126] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81ed7ce3-f96c-4b0e-9b39-d11d35079541 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.478469] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848001, 'name': CreateVM_Task} progress is 25%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 620.483814] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89cb6d0a-42df-4aab-9f0d-c844a4087ac5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.504782] env[67144]: DEBUG nova.compute.provider_tree [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 620.516168] env[67144]: DEBUG nova.scheduler.client.report [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 620.542821] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.407s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.543365] env[67144]: DEBUG nova.compute.manager [None 
req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 620.595092] env[67144]: DEBUG nova.compute.utils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 620.598030] env[67144]: DEBUG nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 620.598030] env[67144]: DEBUG nova.network.neutron [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 620.619043] env[67144]: DEBUG nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Start building block device mappings for instance. 
{{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 620.702152] env[67144]: DEBUG nova.network.neutron [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Successfully created port: 618495ef-fa31-4a5f-bc87-1e975278e852 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 620.738294] env[67144]: DEBUG nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Start spawning the instance on the hypervisor. {{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 620.766924] env[67144]: DEBUG nova.virt.hardware [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 620.767600] env[67144]: DEBUG nova.virt.hardware 
[None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 620.767600] env[67144]: DEBUG nova.virt.hardware [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 620.767727] env[67144]: DEBUG nova.virt.hardware [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 620.768125] env[67144]: DEBUG nova.virt.hardware [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 620.768125] env[67144]: DEBUG nova.virt.hardware [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 620.768260] env[67144]: DEBUG nova.virt.hardware [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 620.768418] env[67144]: DEBUG 
nova.virt.hardware [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 620.768578] env[67144]: DEBUG nova.virt.hardware [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 620.768742] env[67144]: DEBUG nova.virt.hardware [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 620.769410] env[67144]: DEBUG nova.virt.hardware [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 620.769779] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87f65d51-f7ff-4a58-a432-886e93c0b127 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.777844] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d0fe642-a1d4-4f94-b826-af83e650d9b2 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.922085] env[67144]: DEBUG nova.compute.manager [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 
668949c5-1c0b-46a5-a0bc-5406f774b2e3] Received event network-changed-c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 620.922085] env[67144]: DEBUG nova.compute.manager [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Refreshing instance network info cache due to event network-changed-c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 620.922388] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Acquiring lock "refresh_cache-668949c5-1c0b-46a5-a0bc-5406f774b2e3" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 620.922486] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Acquired lock "refresh_cache-668949c5-1c0b-46a5-a0bc-5406f774b2e3" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 620.922649] env[67144]: DEBUG nova.network.neutron [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Refreshing network info cache for port c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 620.953341] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848001, 'name': CreateVM_Task, 'duration_secs': 0.281155} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 620.953341] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 620.953830] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 620.953996] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 620.954727] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 620.955282] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-83af0ba6-c3af-4342-a076-76aa209e370d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.960311] env[67144]: DEBUG oslo_vmware.api [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Waiting for the task: 
(returnval){ [ 620.960311] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5230577c-9d04-03e7-ca1c-857beb6a472e" [ 620.960311] env[67144]: _type = "Task" [ 620.960311] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 620.969922] env[67144]: DEBUG oslo_vmware.api [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5230577c-9d04-03e7-ca1c-857beb6a472e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 621.065990] env[67144]: DEBUG nova.policy [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f3ca7172990b4bfa969656af161bb68d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '94f84f525e854f298d263a5ac213b025', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 621.428942] env[67144]: DEBUG nova.network.neutron [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Successfully created port: a7100a4b-e3a9-4a93-91f1-054a28c8a5f5 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 621.478552] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 
tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 621.479517] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 621.479517] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 621.998027] env[67144]: DEBUG nova.network.neutron [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Updated VIF entry in instance network info cache for port c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 621.999235] env[67144]: DEBUG nova.network.neutron [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Updating instance_info_cache with network_info: [{"id": "c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a", "address": "fa:16:3e:81:64:4a", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.173", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc7c530d2-6a", "ovs_interfaceid": "c7c530d2-6a94-44e7-8fa7-4c7cfc7c475a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 622.015949] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Releasing lock "refresh_cache-668949c5-1c0b-46a5-a0bc-5406f774b2e3" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 622.015949] env[67144]: DEBUG nova.compute.manager [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa 
req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 54af505e-0f30-4848-bd14-04461db40664] Received event network-vif-plugged-15c37861-6f14-4266-b774-76cd03efa607 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 622.015949] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Acquiring lock "54af505e-0f30-4848-bd14-04461db40664-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.015949] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Lock "54af505e-0f30-4848-bd14-04461db40664-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.016345] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Lock "54af505e-0f30-4848-bd14-04461db40664-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.016345] env[67144]: DEBUG nova.compute.manager [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 54af505e-0f30-4848-bd14-04461db40664] No waiting events found dispatching network-vif-plugged-15c37861-6f14-4266-b774-76cd03efa607 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 622.016345] env[67144]: WARNING nova.compute.manager [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 
54af505e-0f30-4848-bd14-04461db40664] Received unexpected event network-vif-plugged-15c37861-6f14-4266-b774-76cd03efa607 for instance with vm_state building and task_state spawning. [ 622.016345] env[67144]: DEBUG nova.compute.manager [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 54af505e-0f30-4848-bd14-04461db40664] Received event network-changed-15c37861-6f14-4266-b774-76cd03efa607 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 622.016512] env[67144]: DEBUG nova.compute.manager [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 54af505e-0f30-4848-bd14-04461db40664] Refreshing instance network info cache due to event network-changed-15c37861-6f14-4266-b774-76cd03efa607. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 622.016512] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Acquiring lock "refresh_cache-54af505e-0f30-4848-bd14-04461db40664" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 622.016512] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Acquired lock "refresh_cache-54af505e-0f30-4848-bd14-04461db40664" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 622.016512] env[67144]: DEBUG nova.network.neutron [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 54af505e-0f30-4848-bd14-04461db40664] Refreshing network info cache for port 15c37861-6f14-4266-b774-76cd03efa607 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 622.936034] env[67144]: DEBUG nova.network.neutron 
[req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 54af505e-0f30-4848-bd14-04461db40664] Updated VIF entry in instance network info cache for port 15c37861-6f14-4266-b774-76cd03efa607. {{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 622.936034] env[67144]: DEBUG nova.network.neutron [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: 54af505e-0f30-4848-bd14-04461db40664] Updating instance_info_cache with network_info: [{"id": "15c37861-6f14-4266-b774-76cd03efa607", "address": "fa:16:3e:e3:cc:b4", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap15c37861-6f", "ovs_interfaceid": "15c37861-6f14-4266-b774-76cd03efa607", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 622.949781] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Releasing lock 
"refresh_cache-54af505e-0f30-4848-bd14-04461db40664" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 622.949781] env[67144]: DEBUG nova.compute.manager [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Received event network-vif-plugged-123b0146-c529-4dd6-800b-2e7bbbcb716b {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 622.949781] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Acquiring lock "b04052f8-b29f-4b32-b249-02b83d3d77f9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.949781] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Lock "b04052f8-b29f-4b32-b249-02b83d3d77f9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.950048] env[67144]: DEBUG oslo_concurrency.lockutils [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] Lock "b04052f8-b29f-4b32-b249-02b83d3d77f9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.950048] env[67144]: DEBUG nova.compute.manager [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] No waiting events found dispatching network-vif-plugged-123b0146-c529-4dd6-800b-2e7bbbcb716b {{(pid=67144) 
pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 622.950048] env[67144]: WARNING nova.compute.manager [req-918b5bbb-5ad9-40ae-98b8-235c1c5e21fa req-9121c7ff-0418-42e2-a04f-8a5f0091d0a3 service nova] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Received unexpected event network-vif-plugged-123b0146-c529-4dd6-800b-2e7bbbcb716b for instance with vm_state building and task_state spawning. [ 623.093631] env[67144]: DEBUG nova.network.neutron [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Successfully updated port: 618495ef-fa31-4a5f-bc87-1e975278e852 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 623.114268] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "refresh_cache-c2d5335a-4332-4828-855d-380cdea64a1a" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 623.114553] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquired lock "refresh_cache-c2d5335a-4332-4828-855d-380cdea64a1a" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 623.118917] env[67144]: DEBUG nova.network.neutron [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 623.147219] env[67144]: DEBUG oslo_concurrency.lockutils 
[None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquiring lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.147219] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.169594] env[67144]: DEBUG nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Starting instance... 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 623.198965] env[67144]: DEBUG nova.network.neutron [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Successfully created port: 062893e1-cc24-4478-a285-0cabddeb2f43 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 623.244462] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.244767] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.246735] env[67144]: INFO nova.compute.claims [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 623.368353] env[67144]: DEBUG nova.network.neutron [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Instance cache missing network info. 
{{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 623.679106] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6bac3cc-3c40-49f3-9a5f-1ab070ba8bae {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.692797] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b1c0490-e3c1-4744-9272-beba34919271 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.731796] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b26fcbc-2c8d-47d7-897f-7fbbf6d3531c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.739581] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41a9d02a-b80f-4fec-abbb-f1744088bb41 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.747346] env[67144]: DEBUG nova.network.neutron [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Updating instance_info_cache with network_info: [{"id": "618495ef-fa31-4a5f-bc87-1e975278e852", "address": "fa:16:3e:c3:2e:07", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.187", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": 
"d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap618495ef-fa", "ovs_interfaceid": "618495ef-fa31-4a5f-bc87-1e975278e852", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 623.760471] env[67144]: DEBUG nova.compute.provider_tree [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 623.763696] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Releasing lock "refresh_cache-c2d5335a-4332-4828-855d-380cdea64a1a" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 623.764032] env[67144]: DEBUG nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Instance network_info: |[{"id": "618495ef-fa31-4a5f-bc87-1e975278e852", "address": "fa:16:3e:c3:2e:07", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.187", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap618495ef-fa", "ovs_interfaceid": "618495ef-fa31-4a5f-bc87-1e975278e852", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 623.765193] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c3:2e:07', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27abaf31-0f39-428c-a8d3-cd7548de6818', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '618495ef-fa31-4a5f-bc87-1e975278e852', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 623.777580] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Creating folder: Project (38e6d6ab2a79447bb038b72c6787028f). Parent ref: group-v572613. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 623.777960] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1a68433f-917a-4911-98fb-9ab21157cd98 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.782261] env[67144]: DEBUG nova.scheduler.client.report [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 623.794705] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Created folder: Project (38e6d6ab2a79447bb038b72c6787028f) in parent group-v572613. [ 623.795059] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Creating folder: Instances. Parent ref: group-v572626. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 623.795137] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c9444240-759c-4f7a-98eb-453ac0a37c0b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.805594] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.560s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.805901] env[67144]: DEBUG nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 623.811913] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Created folder: Instances in parent group-v572626. [ 623.811913] env[67144]: DEBUG oslo.service.loopingcall [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 623.811913] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 623.812156] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-71330c4f-4fc6-4638-a8f6-e5993511c1e1 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.833783] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 623.833783] env[67144]: value = "task-2848004" [ 623.833783] env[67144]: _type = "Task" [ 623.833783] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 623.847695] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848004, 'name': CreateVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 623.877252] env[67144]: DEBUG nova.compute.utils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 623.879071] env[67144]: DEBUG nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Allocating IP information in the background. 
{{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 623.879252] env[67144]: DEBUG nova.network.neutron [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 623.897443] env[67144]: DEBUG nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 623.982473] env[67144]: DEBUG nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Start spawning the instance on the hypervisor. 
{{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 624.021129] env[67144]: DEBUG nova.virt.hardware [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 624.021129] env[67144]: DEBUG nova.virt.hardware [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 624.021129] env[67144]: DEBUG nova.virt.hardware [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 624.021258] env[67144]: DEBUG nova.virt.hardware [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Flavor pref 0:0:0 
{{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 624.021258] env[67144]: DEBUG nova.virt.hardware [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 624.021258] env[67144]: DEBUG nova.virt.hardware [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 624.021398] env[67144]: DEBUG nova.virt.hardware [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 624.021770] env[67144]: DEBUG nova.virt.hardware [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 624.022328] env[67144]: DEBUG nova.virt.hardware [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 624.025074] env[67144]: DEBUG nova.virt.hardware [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 624.025074] env[67144]: DEBUG nova.virt.hardware [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 624.025074] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a88ac068-61a6-4435-a56e-1c42aad93b72 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.034319] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1e55217-b471-4c44-85c0-d9065597e31f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.115086] env[67144]: DEBUG nova.policy [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d8fec0371b84a58977e0d95075e728d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3c9e8de2ee1c4f38b1b30dcd1c7ecd46', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 624.348626] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848004, 'name': CreateVM_Task, 'duration_secs': 0.329961} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 624.348803] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 624.349521] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 624.349686] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 624.350185] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 624.350452] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0f97e271-96f9-4f47-974f-13b51cbc648c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.356630] env[67144]: DEBUG oslo_vmware.api [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] 
Waiting for the task: (returnval){ [ 624.356630] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5255d063-b47f-acb6-bbde-e9de91a67670" [ 624.356630] env[67144]: _type = "Task" [ 624.356630] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 624.367473] env[67144]: DEBUG oslo_vmware.api [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5255d063-b47f-acb6-bbde-e9de91a67670, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 624.733337] env[67144]: DEBUG nova.network.neutron [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Successfully created port: e5b2eff0-fba6-4e1f-95eb-90b424e9b00a {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 624.795206] env[67144]: DEBUG nova.compute.manager [req-bd798b76-d4f9-4e98-9fd4-b06e5ee526d1 req-f016ad3c-eaa4-4894-9ecc-0d5e9c13fe33 service nova] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Received event network-vif-plugged-618495ef-fa31-4a5f-bc87-1e975278e852 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 624.795625] env[67144]: DEBUG oslo_concurrency.lockutils [req-bd798b76-d4f9-4e98-9fd4-b06e5ee526d1 req-f016ad3c-eaa4-4894-9ecc-0d5e9c13fe33 service nova] Acquiring lock "c2d5335a-4332-4828-855d-380cdea64a1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.796138] env[67144]: DEBUG oslo_concurrency.lockutils [req-bd798b76-d4f9-4e98-9fd4-b06e5ee526d1 
req-f016ad3c-eaa4-4894-9ecc-0d5e9c13fe33 service nova] Lock "c2d5335a-4332-4828-855d-380cdea64a1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.796138] env[67144]: DEBUG oslo_concurrency.lockutils [req-bd798b76-d4f9-4e98-9fd4-b06e5ee526d1 req-f016ad3c-eaa4-4894-9ecc-0d5e9c13fe33 service nova] Lock "c2d5335a-4332-4828-855d-380cdea64a1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 624.796435] env[67144]: DEBUG nova.compute.manager [req-bd798b76-d4f9-4e98-9fd4-b06e5ee526d1 req-f016ad3c-eaa4-4894-9ecc-0d5e9c13fe33 service nova] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] No waiting events found dispatching network-vif-plugged-618495ef-fa31-4a5f-bc87-1e975278e852 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 624.796502] env[67144]: WARNING nova.compute.manager [req-bd798b76-d4f9-4e98-9fd4-b06e5ee526d1 req-f016ad3c-eaa4-4894-9ecc-0d5e9c13fe33 service nova] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Received unexpected event network-vif-plugged-618495ef-fa31-4a5f-bc87-1e975278e852 for instance with vm_state building and task_state spawning. 
[ 624.878829] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 624.879359] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 624.879359] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 624.909012] env[67144]: DEBUG nova.compute.manager [req-50f00362-633d-477a-9ec6-b06e2922cb63 req-bc1668f6-d18c-4cac-be91-405557c56fa1 service nova] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Received event network-changed-123b0146-c529-4dd6-800b-2e7bbbcb716b {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 624.909657] env[67144]: DEBUG nova.compute.manager [req-50f00362-633d-477a-9ec6-b06e2922cb63 req-bc1668f6-d18c-4cac-be91-405557c56fa1 service nova] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Refreshing instance network info cache due to event network-changed-123b0146-c529-4dd6-800b-2e7bbbcb716b. 
{{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 624.910037] env[67144]: DEBUG oslo_concurrency.lockutils [req-50f00362-633d-477a-9ec6-b06e2922cb63 req-bc1668f6-d18c-4cac-be91-405557c56fa1 service nova] Acquiring lock "refresh_cache-b04052f8-b29f-4b32-b249-02b83d3d77f9" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 624.912304] env[67144]: DEBUG oslo_concurrency.lockutils [req-50f00362-633d-477a-9ec6-b06e2922cb63 req-bc1668f6-d18c-4cac-be91-405557c56fa1 service nova] Acquired lock "refresh_cache-b04052f8-b29f-4b32-b249-02b83d3d77f9" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 624.912304] env[67144]: DEBUG nova.network.neutron [req-50f00362-633d-477a-9ec6-b06e2922cb63 req-bc1668f6-d18c-4cac-be91-405557c56fa1 service nova] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Refreshing network info cache for port 123b0146-c529-4dd6-800b-2e7bbbcb716b {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 625.333514] env[67144]: DEBUG nova.network.neutron [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Successfully updated port: a7100a4b-e3a9-4a93-91f1-054a28c8a5f5 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 625.345231] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Acquiring lock "refresh_cache-6cbf4358-dcfa-471b-ae1a-e6a512c47d26" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 625.345383] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 
tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Acquired lock "refresh_cache-6cbf4358-dcfa-471b-ae1a-e6a512c47d26" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 625.345555] env[67144]: DEBUG nova.network.neutron [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 625.475131] env[67144]: DEBUG nova.network.neutron [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Instance cache missing network info. {{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.162594] env[67144]: DEBUG nova.network.neutron [req-50f00362-633d-477a-9ec6-b06e2922cb63 req-bc1668f6-d18c-4cac-be91-405557c56fa1 service nova] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Updated VIF entry in instance network info cache for port 123b0146-c529-4dd6-800b-2e7bbbcb716b. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 626.162929] env[67144]: DEBUG nova.network.neutron [req-50f00362-633d-477a-9ec6-b06e2922cb63 req-bc1668f6-d18c-4cac-be91-405557c56fa1 service nova] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Updating instance_info_cache with network_info: [{"id": "123b0146-c529-4dd6-800b-2e7bbbcb716b", "address": "fa:16:3e:88:54:f4", "network": {"id": "06dc3446-bc0e-488d-9aef-13f79682ef85", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1175299775-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b2c1e53a6c6f44958175b6d128c00596", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a071ecf4-e713-4f97-9271-8c17952f6dee", "external-id": "nsx-vlan-transportzone-23", "segmentation_id": 23, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap123b0146-c5", "ovs_interfaceid": "123b0146-c529-4dd6-800b-2e7bbbcb716b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 626.174956] env[67144]: DEBUG oslo_concurrency.lockutils [req-50f00362-633d-477a-9ec6-b06e2922cb63 req-bc1668f6-d18c-4cac-be91-405557c56fa1 service nova] Releasing lock "refresh_cache-b04052f8-b29f-4b32-b249-02b83d3d77f9" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 626.445020] env[67144]: DEBUG nova.network.neutron [None 
req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Updating instance_info_cache with network_info: [{"id": "a7100a4b-e3a9-4a93-91f1-054a28c8a5f5", "address": "fa:16:3e:15:8d:29", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.199", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa7100a4b-e3", "ovs_interfaceid": "a7100a4b-e3a9-4a93-91f1-054a28c8a5f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 626.456008] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Releasing lock "refresh_cache-6cbf4358-dcfa-471b-ae1a-e6a512c47d26" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 626.456315] env[67144]: DEBUG nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 
tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Instance network_info: |[{"id": "a7100a4b-e3a9-4a93-91f1-054a28c8a5f5", "address": "fa:16:3e:15:8d:29", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.199", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa7100a4b-e3", "ovs_interfaceid": "a7100a4b-e3a9-4a93-91f1-054a28c8a5f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 626.456699] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:15:8d:29', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27abaf31-0f39-428c-a8d3-cd7548de6818', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a7100a4b-e3a9-4a93-91f1-054a28c8a5f5', 'vif_model': 'vmxnet3'}] 
{{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 626.465751] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Creating folder: Project (1996e6e4309b4af8b8fce4f844903eb8). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 626.470793] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-35471a70-23a6-43db-832a-cb23cd2e80c3 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.480338] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Created folder: Project (1996e6e4309b4af8b8fce4f844903eb8) in parent group-v572613. [ 626.480727] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Creating folder: Instances. Parent ref: group-v572629. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 626.481091] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-daa0e66d-fce7-4997-a4ed-21a513be1264 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.493120] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Created folder: Instances in parent group-v572629. 
[ 626.493120] env[67144]: DEBUG oslo.service.loopingcall [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 626.493120] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 626.493120] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-677d4e78-5742-4ca8-a270-3d39e1c9e982 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.512929] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 626.512929] env[67144]: value = "task-2848007" [ 626.512929] env[67144]: _type = "Task" [ 626.512929] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 626.521143] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848007, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 626.702214] env[67144]: DEBUG nova.network.neutron [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Successfully updated port: 062893e1-cc24-4478-a285-0cabddeb2f43 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 626.715050] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Acquiring lock "refresh_cache-5bb4c082-f5fc-42e6-891a-4866eef1add6" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.715598] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Acquired lock "refresh_cache-5bb4c082-f5fc-42e6-891a-4866eef1add6" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 626.715958] env[67144]: DEBUG nova.network.neutron [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 626.877440] env[67144]: DEBUG nova.network.neutron [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Instance cache missing network info. 
{{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.949833] env[67144]: DEBUG nova.network.neutron [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Successfully updated port: e5b2eff0-fba6-4e1f-95eb-90b424e9b00a {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 626.964541] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquiring lock "refresh_cache-b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.964541] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquired lock "refresh_cache-b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 626.964541] env[67144]: DEBUG nova.network.neutron [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 627.028271] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848007, 'name': CreateVM_Task, 'duration_secs': 0.3137} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 627.029405] env[67144]: DEBUG nova.network.neutron [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Instance cache missing network info. {{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 627.031342] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 627.031997] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.032170] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 627.032473] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 627.034662] env[67144]: DEBUG oslo_vmware.service [-] Invoking 
HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-448c9ed8-5877-43bf-b0bf-3e1ac310aac8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.037926] env[67144]: DEBUG oslo_vmware.api [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Waiting for the task: (returnval){ [ 627.037926] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52dbfc01-29b1-a73b-4ced-191a24707808" [ 627.037926] env[67144]: _type = "Task" [ 627.037926] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 627.046657] env[67144]: DEBUG oslo_vmware.api [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52dbfc01-29b1-a73b-4ced-191a24707808, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 627.311554] env[67144]: DEBUG nova.network.neutron [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Updating instance_info_cache with network_info: [{"id": "062893e1-cc24-4478-a285-0cabddeb2f43", "address": "fa:16:3e:8e:5d:99", "network": {"id": "9ee498a0-0c8d-4c25-a238-915af0df4afe", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-801673838-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "94f84f525e854f298d263a5ac213b025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap062893e1-cc", "ovs_interfaceid": "062893e1-cc24-4478-a285-0cabddeb2f43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.332609] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Releasing lock "refresh_cache-5bb4c082-f5fc-42e6-891a-4866eef1add6" {{(pid=67144) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 627.332747] env[67144]: DEBUG nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Instance network_info: |[{"id": "062893e1-cc24-4478-a285-0cabddeb2f43", "address": "fa:16:3e:8e:5d:99", "network": {"id": "9ee498a0-0c8d-4c25-a238-915af0df4afe", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-801673838-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "94f84f525e854f298d263a5ac213b025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap062893e1-cc", "ovs_interfaceid": "062893e1-cc24-4478-a285-0cabddeb2f43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 627.333441] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8e:5d:99', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': '7abeeabc-351d-404c-ada6-6a7305667707', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '062893e1-cc24-4478-a285-0cabddeb2f43', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 627.341721] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Creating folder: Project (94f84f525e854f298d263a5ac213b025). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 627.341721] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a7af7f3c-2ea5-4da4-a903-215167dd4d0b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.355019] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Created folder: Project (94f84f525e854f298d263a5ac213b025) in parent group-v572613. [ 627.355019] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Creating folder: Instances. Parent ref: group-v572632. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 627.355019] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-deabd2ea-a077-437d-acf7-e37d26777da6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.367242] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Created folder: Instances in parent group-v572632. [ 627.367242] env[67144]: DEBUG oslo.service.loopingcall [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 627.367242] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 627.367242] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ff535839-b926-4e84-8c1a-f78d094df84f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.387789] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 627.387789] env[67144]: value = "task-2848010" [ 627.387789] env[67144]: _type = "Task" [ 627.387789] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 627.395723] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848010, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 627.424494] env[67144]: DEBUG nova.network.neutron [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Updating instance_info_cache with network_info: [{"id": "e5b2eff0-fba6-4e1f-95eb-90b424e9b00a", "address": "fa:16:3e:0e:1c:ed", "network": {"id": "b0624a8f-c335-4b74-8bb5-6047d512470b", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-282087725-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3c9e8de2ee1c4f38b1b30dcd1c7ecd46", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9732690c-bdcf-4e6f-9a32-42c196333eb8", "external-id": "nsx-vlan-transportzone-548", "segmentation_id": 548, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape5b2eff0-fb", "ovs_interfaceid": "e5b2eff0-fba6-4e1f-95eb-90b424e9b00a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.442991] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Releasing lock "refresh_cache-b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" {{(pid=67144) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 627.443363] env[67144]: DEBUG nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Instance network_info: |[{"id": "e5b2eff0-fba6-4e1f-95eb-90b424e9b00a", "address": "fa:16:3e:0e:1c:ed", "network": {"id": "b0624a8f-c335-4b74-8bb5-6047d512470b", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-282087725-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3c9e8de2ee1c4f38b1b30dcd1c7ecd46", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9732690c-bdcf-4e6f-9a32-42c196333eb8", "external-id": "nsx-vlan-transportzone-548", "segmentation_id": 548, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape5b2eff0-fb", "ovs_interfaceid": "e5b2eff0-fba6-4e1f-95eb-90b424e9b00a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 627.444098] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0e:1c:ed', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': '9732690c-bdcf-4e6f-9a32-42c196333eb8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e5b2eff0-fba6-4e1f-95eb-90b424e9b00a', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 627.452959] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Creating folder: Project (3c9e8de2ee1c4f38b1b30dcd1c7ecd46). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 627.453278] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9c90da97-f3c8-4460-b6af-a4691af107fd {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.466196] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Created folder: Project (3c9e8de2ee1c4f38b1b30dcd1c7ecd46) in parent group-v572613. [ 627.466196] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Creating folder: Instances. Parent ref: group-v572635. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 627.466196] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fcb83b23-57c4-4154-b029-232c7d61684f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.476910] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Created folder: Instances in parent group-v572635. [ 627.477582] env[67144]: DEBUG oslo.service.loopingcall [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 627.478185] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 627.478512] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-15f11da5-d883-4033-ad92-d1015060e120 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.500740] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 627.500740] env[67144]: value = "task-2848013" [ 627.500740] env[67144]: _type = "Task" [ 627.500740] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 627.513181] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848013, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 627.551870] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 627.552137] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 627.552367] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.902420] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848010, 'name': CreateVM_Task, 'duration_secs': 0.298087} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 627.902557] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 627.903242] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.903390] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 627.903711] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 627.903978] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6dc0a5d1-225a-4d70-b515-28d0ecad65c8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.911036] env[67144]: DEBUG oslo_vmware.api [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Waiting for the task: (returnval){ [ 
627.911036] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]529396bb-a3b4-5bf3-3421-b2a3b08bf113" [ 627.911036] env[67144]: _type = "Task" [ 627.911036] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 627.925272] env[67144]: DEBUG oslo_vmware.api [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]529396bb-a3b4-5bf3-3421-b2a3b08bf113, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 628.014420] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848013, 'name': CreateVM_Task, 'duration_secs': 0.346926} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 628.014600] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 628.015244] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 628.421669] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 628.421924] env[67144]: DEBUG 
nova.virt.vmwareapi.vmops [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 628.422150] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 628.422615] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 628.423668] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 628.423668] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cebdf220-a92f-4946-88b0-03cc85e6459b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.428965] env[67144]: DEBUG oslo_vmware.api [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 
tempest-ImagesNegativeTestJSON-722030986-project-member] Waiting for the task: (returnval){ [ 628.428965] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]526c6f2a-2a28-bc9d-a37f-9d605ec1a73c" [ 628.428965] env[67144]: _type = "Task" [ 628.428965] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 628.436027] env[67144]: DEBUG oslo_vmware.api [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]526c6f2a-2a28-bc9d-a37f-9d605ec1a73c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 628.698743] env[67144]: DEBUG nova.compute.manager [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Received event network-changed-618495ef-fa31-4a5f-bc87-1e975278e852 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 628.699048] env[67144]: DEBUG nova.compute.manager [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Refreshing instance network info cache due to event network-changed-618495ef-fa31-4a5f-bc87-1e975278e852. 
{{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 628.699087] env[67144]: DEBUG oslo_concurrency.lockutils [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] Acquiring lock "refresh_cache-c2d5335a-4332-4828-855d-380cdea64a1a" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 628.699235] env[67144]: DEBUG oslo_concurrency.lockutils [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] Acquired lock "refresh_cache-c2d5335a-4332-4828-855d-380cdea64a1a" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 628.699394] env[67144]: DEBUG nova.network.neutron [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Refreshing network info cache for port 618495ef-fa31-4a5f-bc87-1e975278e852 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 628.941249] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 628.941249] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 628.941249] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab 
tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 629.189078] env[67144]: DEBUG nova.compute.manager [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Received event network-vif-plugged-a7100a4b-e3a9-4a93-91f1-054a28c8a5f5 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 629.189321] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Acquiring lock "6cbf4358-dcfa-471b-ae1a-e6a512c47d26-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.189526] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Lock "6cbf4358-dcfa-471b-ae1a-e6a512c47d26-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.189691] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Lock "6cbf4358-dcfa-471b-ae1a-e6a512c47d26-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.189945] env[67144]: DEBUG nova.compute.manager [req-efd6bc19-d10f-4465-8048-074c3b8a1088 
req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] No waiting events found dispatching network-vif-plugged-a7100a4b-e3a9-4a93-91f1-054a28c8a5f5 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 629.190235] env[67144]: WARNING nova.compute.manager [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Received unexpected event network-vif-plugged-a7100a4b-e3a9-4a93-91f1-054a28c8a5f5 for instance with vm_state building and task_state spawning. [ 629.190490] env[67144]: DEBUG nova.compute.manager [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Received event network-changed-a7100a4b-e3a9-4a93-91f1-054a28c8a5f5 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 629.190490] env[67144]: DEBUG nova.compute.manager [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Refreshing instance network info cache due to event network-changed-a7100a4b-e3a9-4a93-91f1-054a28c8a5f5. 
{{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 629.190672] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Acquiring lock "refresh_cache-6cbf4358-dcfa-471b-ae1a-e6a512c47d26" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 629.190768] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Acquired lock "refresh_cache-6cbf4358-dcfa-471b-ae1a-e6a512c47d26" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 629.190905] env[67144]: DEBUG nova.network.neutron [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Refreshing network info cache for port a7100a4b-e3a9-4a93-91f1-054a28c8a5f5 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 629.314056] env[67144]: DEBUG nova.network.neutron [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Updated VIF entry in instance network info cache for port 618495ef-fa31-4a5f-bc87-1e975278e852. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 629.314056] env[67144]: DEBUG nova.network.neutron [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Updating instance_info_cache with network_info: [{"id": "618495ef-fa31-4a5f-bc87-1e975278e852", "address": "fa:16:3e:c3:2e:07", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.187", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap618495ef-fa", "ovs_interfaceid": "618495ef-fa31-4a5f-bc87-1e975278e852", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.327386] env[67144]: DEBUG oslo_concurrency.lockutils [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] Releasing lock "refresh_cache-c2d5335a-4332-4828-855d-380cdea64a1a" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 629.327386] env[67144]: DEBUG nova.compute.manager [req-d784a154-def5-4609-960a-18ce76f49eb9 
req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Received event network-vif-plugged-e5b2eff0-fba6-4e1f-95eb-90b424e9b00a {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 629.327386] env[67144]: DEBUG oslo_concurrency.lockutils [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] Acquiring lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.327386] env[67144]: DEBUG oslo_concurrency.lockutils [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] Lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.327633] env[67144]: DEBUG oslo_concurrency.lockutils [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] Lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.327633] env[67144]: DEBUG nova.compute.manager [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] No waiting events found dispatching network-vif-plugged-e5b2eff0-fba6-4e1f-95eb-90b424e9b00a {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 629.327633] env[67144]: WARNING nova.compute.manager [req-d784a154-def5-4609-960a-18ce76f49eb9 req-f45f4d6b-20a4-45a0-b15c-9e6872055a12 service nova] [instance: 
b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Received unexpected event network-vif-plugged-e5b2eff0-fba6-4e1f-95eb-90b424e9b00a for instance with vm_state building and task_state spawning. [ 629.834236] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquiring lock "ca7b7941-c016-4968-9beb-f8c094ca16cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.834521] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Lock "ca7b7941-c016-4968-9beb-f8c094ca16cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.849068] env[67144]: DEBUG nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 629.902985] env[67144]: DEBUG nova.network.neutron [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Updated VIF entry in instance network info cache for port a7100a4b-e3a9-4a93-91f1-054a28c8a5f5. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 629.902985] env[67144]: DEBUG nova.network.neutron [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Updating instance_info_cache with network_info: [{"id": "a7100a4b-e3a9-4a93-91f1-054a28c8a5f5", "address": "fa:16:3e:15:8d:29", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.199", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa7100a4b-e3", "ovs_interfaceid": "a7100a4b-e3a9-4a93-91f1-054a28c8a5f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.918028] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Releasing lock "refresh_cache-6cbf4358-dcfa-471b-ae1a-e6a512c47d26" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 629.918028] env[67144]: DEBUG nova.compute.manager [req-efd6bc19-d10f-4465-8048-074c3b8a1088 
req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Received event network-vif-plugged-062893e1-cc24-4478-a285-0cabddeb2f43 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 629.918028] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Acquiring lock "5bb4c082-f5fc-42e6-891a-4866eef1add6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.918028] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Lock "5bb4c082-f5fc-42e6-891a-4866eef1add6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.918379] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Lock "5bb4c082-f5fc-42e6-891a-4866eef1add6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.918379] env[67144]: DEBUG nova.compute.manager [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] No waiting events found dispatching network-vif-plugged-062893e1-cc24-4478-a285-0cabddeb2f43 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 629.918379] env[67144]: WARNING nova.compute.manager [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 
5bb4c082-f5fc-42e6-891a-4866eef1add6] Received unexpected event network-vif-plugged-062893e1-cc24-4478-a285-0cabddeb2f43 for instance with vm_state building and task_state spawning. [ 629.918379] env[67144]: DEBUG nova.compute.manager [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Received event network-changed-062893e1-cc24-4478-a285-0cabddeb2f43 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 629.918488] env[67144]: DEBUG nova.compute.manager [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Refreshing instance network info cache due to event network-changed-062893e1-cc24-4478-a285-0cabddeb2f43. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 629.918488] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Acquiring lock "refresh_cache-5bb4c082-f5fc-42e6-891a-4866eef1add6" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 629.918488] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Acquired lock "refresh_cache-5bb4c082-f5fc-42e6-891a-4866eef1add6" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 629.918488] env[67144]: DEBUG nova.network.neutron [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Refreshing network info cache for port 062893e1-cc24-4478-a285-0cabddeb2f43 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 629.921077] env[67144]: DEBUG oslo_concurrency.lockutils [None 
req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.921482] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.923113] env[67144]: INFO nova.compute.claims [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 630.172552] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d7f656d-bd41-4555-9a05-3c28b3ad5b87 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.182138] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c60e5e7e-87d5-4782-a20a-2ef0f8470518 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.216149] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4551bb7-6900-47b0-95c1-f8aa45026ff0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.225935] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e4e443d5-6415-4235-9fb2-bc997298b4a3 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.240550] env[67144]: DEBUG nova.compute.provider_tree [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 630.250393] env[67144]: DEBUG nova.scheduler.client.report [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 630.265700] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 630.267205] env[67144]: DEBUG nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Start building networks 
asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 630.312793] env[67144]: DEBUG nova.compute.utils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 630.313481] env[67144]: DEBUG nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 630.318389] env[67144]: DEBUG nova.network.neutron [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 630.325554] env[67144]: DEBUG nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 630.401294] env[67144]: DEBUG nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Start spawning the instance on the hypervisor. 
{{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 630.424844] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.427491] env[67144]: DEBUG nova.virt.hardware [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 630.427864] env[67144]: DEBUG nova.virt.hardware [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 630.427864] env[67144]: DEBUG nova.virt.hardware [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Image limits 0:0:0 
{{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 630.428070] env[67144]: DEBUG nova.virt.hardware [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 630.428264] env[67144]: DEBUG nova.virt.hardware [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 630.428428] env[67144]: DEBUG nova.virt.hardware [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 630.428639] env[67144]: DEBUG nova.virt.hardware [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 630.428799] env[67144]: DEBUG nova.virt.hardware [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 630.429079] env[67144]: DEBUG nova.virt.hardware [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 
tempest-ServerExternalEventsTest-1958997424-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 630.429141] env[67144]: DEBUG nova.virt.hardware [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 630.429309] env[67144]: DEBUG nova.virt.hardware [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 630.429584] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.429784] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Starting heal instance info cache {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 630.429890] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Rebuilding the list of instances to heal {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 630.433872] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13eeaafa-854c-4495-b855-4a7684600f54 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.442716] env[67144]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e96d091-b69d-460e-9f5e-8ab60885c762 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.460307] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.460469] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 54af505e-0f30-4848-bd14-04461db40664] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.460605] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.460734] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.460933] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.460995] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Skipping network cache update for instance because it is Building. 
{{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.461111] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.461229] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.461374] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.461498] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Didn't find any instances for network info cache update. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 630.462742] env[67144]: DEBUG nova.network.neutron [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Updated VIF entry in instance network info cache for port 062893e1-cc24-4478-a285-0cabddeb2f43. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 630.463098] env[67144]: DEBUG nova.network.neutron [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Updating instance_info_cache with network_info: [{"id": "062893e1-cc24-4478-a285-0cabddeb2f43", "address": "fa:16:3e:8e:5d:99", "network": {"id": "9ee498a0-0c8d-4c25-a238-915af0df4afe", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-801673838-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "94f84f525e854f298d263a5ac213b025", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7abeeabc-351d-404c-ada6-6a7305667707", "external-id": "nsx-vlan-transportzone-9", "segmentation_id": 9, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap062893e1-cc", "ovs_interfaceid": "062893e1-cc24-4478-a285-0cabddeb2f43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 630.464822] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.465203] env[67144]: DEBUG oslo_service.periodic_task [None 
req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.465414] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.465610] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.465828] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.466166] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.466239] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67144) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 630.466357] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.473570] env[67144]: DEBUG oslo_concurrency.lockutils [req-efd6bc19-d10f-4465-8048-074c3b8a1088 req-fbe6fcfa-3ecc-4590-b32a-d98e3834d919 service nova] Releasing lock "refresh_cache-5bb4c082-f5fc-42e6-891a-4866eef1add6" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 630.478366] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.478574] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.478863] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 630.479050] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67144) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 630.480298] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22bd4918-5aa3-4e3f-9042-45e00f728b90 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.490208] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78b150fc-8c24-4e30-8144-ac35a0aa7012 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.508916] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d726eedb-6200-44f2-a6af-4b2f7099dc26 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.515862] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dd91825-818d-4e02-87a9-2e4c044ee712 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.549134] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181084MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=67144) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 630.549134] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.549266] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.561862] env[67144]: DEBUG nova.policy [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c06c69f45c6944f79e3a4034a0d6c6ca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd2d73b3c73df451d8fe558fd35ecc55e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 630.619420] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 668949c5-1c0b-46a5-a0bc-5406f774b2e3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.619742] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 54af505e-0f30-4848-bd14-04461db40664 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.619936] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b04052f8-b29f-4b32-b249-02b83d3d77f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.620128] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 99cbc3d9-8c82-4a32-8adb-59572bab2eca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.620305] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 6cbf4358-dcfa-471b-ae1a-e6a512c47d26 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.620475] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance c2d5335a-4332-4828-855d-380cdea64a1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.620638] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 5bb4c082-f5fc-42e6-891a-4866eef1add6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.620818] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.620986] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance ca7b7941-c016-4968-9beb-f8c094ca16cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.621250] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 630.621972] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 630.758840] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f23e313-5fd1-410d-b216-39731883d7e8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.766362] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de1b7950-55c6-4b29-ac5b-dd966c022bd2 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.799708] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28cb5caa-04af-4bdc-ac2f-17c6989551da {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.807226] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6c54589-70a7-44fa-92d6-6de69a259a18 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.820647] env[67144]: DEBUG nova.compute.provider_tree [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed in ProviderTree for provider: 
0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 630.829053] env[67144]: DEBUG nova.scheduler.client.report [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 630.848493] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67144) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 630.848893] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.299s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.581632] env[67144]: DEBUG nova.network.neutron [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Successfully created port: 2a298409-f7b8-4281-af41-45621f42e1e5 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 631.610470] env[67144]: DEBUG nova.compute.manager [req-0b1fc5fa-8892-43af-9454-446363e3458b req-30102fb8-b26e-4ecf-a099-86a701664191 service nova] [instance: 
b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Received event network-changed-e5b2eff0-fba6-4e1f-95eb-90b424e9b00a {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 631.610553] env[67144]: DEBUG nova.compute.manager [req-0b1fc5fa-8892-43af-9454-446363e3458b req-30102fb8-b26e-4ecf-a099-86a701664191 service nova] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Refreshing instance network info cache due to event network-changed-e5b2eff0-fba6-4e1f-95eb-90b424e9b00a. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 631.610772] env[67144]: DEBUG oslo_concurrency.lockutils [req-0b1fc5fa-8892-43af-9454-446363e3458b req-30102fb8-b26e-4ecf-a099-86a701664191 service nova] Acquiring lock "refresh_cache-b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 631.610916] env[67144]: DEBUG oslo_concurrency.lockutils [req-0b1fc5fa-8892-43af-9454-446363e3458b req-30102fb8-b26e-4ecf-a099-86a701664191 service nova] Acquired lock "refresh_cache-b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 631.612330] env[67144]: DEBUG nova.network.neutron [req-0b1fc5fa-8892-43af-9454-446363e3458b req-30102fb8-b26e-4ecf-a099-86a701664191 service nova] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Refreshing network info cache for port e5b2eff0-fba6-4e1f-95eb-90b424e9b00a {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 632.578658] env[67144]: DEBUG nova.network.neutron [req-0b1fc5fa-8892-43af-9454-446363e3458b req-30102fb8-b26e-4ecf-a099-86a701664191 service nova] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Updated VIF entry in instance network info cache for port e5b2eff0-fba6-4e1f-95eb-90b424e9b00a. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 632.580232] env[67144]: DEBUG nova.network.neutron [req-0b1fc5fa-8892-43af-9454-446363e3458b req-30102fb8-b26e-4ecf-a099-86a701664191 service nova] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Updating instance_info_cache with network_info: [{"id": "e5b2eff0-fba6-4e1f-95eb-90b424e9b00a", "address": "fa:16:3e:0e:1c:ed", "network": {"id": "b0624a8f-c335-4b74-8bb5-6047d512470b", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-282087725-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3c9e8de2ee1c4f38b1b30dcd1c7ecd46", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9732690c-bdcf-4e6f-9a32-42c196333eb8", "external-id": "nsx-vlan-transportzone-548", "segmentation_id": 548, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape5b2eff0-fb", "ovs_interfaceid": "e5b2eff0-fba6-4e1f-95eb-90b424e9b00a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 632.596156] env[67144]: DEBUG oslo_concurrency.lockutils [req-0b1fc5fa-8892-43af-9454-446363e3458b req-30102fb8-b26e-4ecf-a099-86a701664191 service nova] Releasing lock "refresh_cache-b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 634.312465] env[67144]: DEBUG nova.network.neutron [None 
req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Successfully updated port: 2a298409-f7b8-4281-af41-45621f42e1e5 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 634.330150] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquiring lock "refresh_cache-ca7b7941-c016-4968-9beb-f8c094ca16cd" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 634.330150] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquired lock "refresh_cache-ca7b7941-c016-4968-9beb-f8c094ca16cd" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 634.330309] env[67144]: DEBUG nova.network.neutron [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 634.500158] env[67144]: DEBUG nova.network.neutron [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Instance cache missing network info. 
{{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 635.440033] env[67144]: DEBUG nova.network.neutron [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Updating instance_info_cache with network_info: [{"id": "2a298409-f7b8-4281-af41-45621f42e1e5", "address": "fa:16:3e:a8:47:17", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2a298409-f7", "ovs_interfaceid": "2a298409-f7b8-4281-af41-45621f42e1e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 635.456153] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Releasing lock "refresh_cache-ca7b7941-c016-4968-9beb-f8c094ca16cd" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 
635.456452] env[67144]: DEBUG nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Instance network_info: |[{"id": "2a298409-f7b8-4281-af41-45621f42e1e5", "address": "fa:16:3e:a8:47:17", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2a298409-f7", "ovs_interfaceid": "2a298409-f7b8-4281-af41-45621f42e1e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 635.456974] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a8:47:17', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '27abaf31-0f39-428c-a8d3-cd7548de6818', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 
'iface_id': '2a298409-f7b8-4281-af41-45621f42e1e5', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 635.467728] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Creating folder: Project (d2d73b3c73df451d8fe558fd35ecc55e). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 635.468138] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-87b326a7-aa0c-42a5-bf9f-27321384a300 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.483426] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Created folder: Project (d2d73b3c73df451d8fe558fd35ecc55e) in parent group-v572613. [ 635.486273] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Creating folder: Instances. Parent ref: group-v572638. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 635.486273] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7d70e0ef-69f1-44a9-970c-536d931b4410 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.495164] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Created folder: Instances in parent group-v572638. 
[ 635.495414] env[67144]: DEBUG oslo.service.loopingcall [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 635.495676] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 635.495786] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-150a7079-a829-4d21-9e2a-654384cfa774 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.527865] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 635.527865] env[67144]: value = "task-2848016" [ 635.527865] env[67144]: _type = "Task" [ 635.527865] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 635.539088] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848016, 'name': CreateVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 636.042220] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848016, 'name': CreateVM_Task, 'duration_secs': 0.308037} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 636.042485] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 636.043592] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 636.043856] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 636.044186] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 636.044246] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5034adb5-2c76-42b1-b9f3-5d71205b7a97 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.049058] env[67144]: DEBUG oslo_vmware.api [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Waiting for the task: 
(returnval){ [ 636.049058] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]526cdfe1-4594-f82c-de21-7a8d763ebc62" [ 636.049058] env[67144]: _type = "Task" [ 636.049058] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 636.059861] env[67144]: DEBUG oslo_vmware.api [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]526cdfe1-4594-f82c-de21-7a8d763ebc62, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 636.560746] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 636.561114] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 636.561388] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 636.685345] 
env[67144]: DEBUG nova.compute.manager [req-ccb03378-8302-44e1-b95a-4609f4f33d32 req-0f5cf10a-3652-4d8f-89b0-79e3171f7d0e service nova] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Received event network-vif-plugged-2a298409-f7b8-4281-af41-45621f42e1e5 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 636.685345] env[67144]: DEBUG oslo_concurrency.lockutils [req-ccb03378-8302-44e1-b95a-4609f4f33d32 req-0f5cf10a-3652-4d8f-89b0-79e3171f7d0e service nova] Acquiring lock "ca7b7941-c016-4968-9beb-f8c094ca16cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.685345] env[67144]: DEBUG oslo_concurrency.lockutils [req-ccb03378-8302-44e1-b95a-4609f4f33d32 req-0f5cf10a-3652-4d8f-89b0-79e3171f7d0e service nova] Lock "ca7b7941-c016-4968-9beb-f8c094ca16cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.685345] env[67144]: DEBUG oslo_concurrency.lockutils [req-ccb03378-8302-44e1-b95a-4609f4f33d32 req-0f5cf10a-3652-4d8f-89b0-79e3171f7d0e service nova] Lock "ca7b7941-c016-4968-9beb-f8c094ca16cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 636.686200] env[67144]: DEBUG nova.compute.manager [req-ccb03378-8302-44e1-b95a-4609f4f33d32 req-0f5cf10a-3652-4d8f-89b0-79e3171f7d0e service nova] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] No waiting events found dispatching network-vif-plugged-2a298409-f7b8-4281-af41-45621f42e1e5 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 636.686617] env[67144]: WARNING nova.compute.manager [req-ccb03378-8302-44e1-b95a-4609f4f33d32 
req-0f5cf10a-3652-4d8f-89b0-79e3171f7d0e service nova] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Received unexpected event network-vif-plugged-2a298409-f7b8-4281-af41-45621f42e1e5 for instance with vm_state building and task_state spawning. [ 642.155778] env[67144]: DEBUG nova.compute.manager [req-2f8c4ce0-d404-4453-b8b6-9dab25a240a5 req-60960d20-6196-4a02-a2f8-97cc006dd06c service nova] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Received event network-changed-2a298409-f7b8-4281-af41-45621f42e1e5 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 642.156125] env[67144]: DEBUG nova.compute.manager [req-2f8c4ce0-d404-4453-b8b6-9dab25a240a5 req-60960d20-6196-4a02-a2f8-97cc006dd06c service nova] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Refreshing instance network info cache due to event network-changed-2a298409-f7b8-4281-af41-45621f42e1e5. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 642.156225] env[67144]: DEBUG oslo_concurrency.lockutils [req-2f8c4ce0-d404-4453-b8b6-9dab25a240a5 req-60960d20-6196-4a02-a2f8-97cc006dd06c service nova] Acquiring lock "refresh_cache-ca7b7941-c016-4968-9beb-f8c094ca16cd" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 642.156365] env[67144]: DEBUG oslo_concurrency.lockutils [req-2f8c4ce0-d404-4453-b8b6-9dab25a240a5 req-60960d20-6196-4a02-a2f8-97cc006dd06c service nova] Acquired lock "refresh_cache-ca7b7941-c016-4968-9beb-f8c094ca16cd" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 642.156517] env[67144]: DEBUG nova.network.neutron [req-2f8c4ce0-d404-4453-b8b6-9dab25a240a5 req-60960d20-6196-4a02-a2f8-97cc006dd06c service nova] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Refreshing network info cache for port 2a298409-f7b8-4281-af41-45621f42e1e5 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 
643.408165] env[67144]: DEBUG nova.network.neutron [req-2f8c4ce0-d404-4453-b8b6-9dab25a240a5 req-60960d20-6196-4a02-a2f8-97cc006dd06c service nova] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Updated VIF entry in instance network info cache for port 2a298409-f7b8-4281-af41-45621f42e1e5. {{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 643.408540] env[67144]: DEBUG nova.network.neutron [req-2f8c4ce0-d404-4453-b8b6-9dab25a240a5 req-60960d20-6196-4a02-a2f8-97cc006dd06c service nova] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Updating instance_info_cache with network_info: [{"id": "2a298409-f7b8-4281-af41-45621f42e1e5", "address": "fa:16:3e:a8:47:17", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.205", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2a298409-f7", "ovs_interfaceid": "2a298409-f7b8-4281-af41-45621f42e1e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 643.427142] env[67144]: DEBUG oslo_concurrency.lockutils [req-2f8c4ce0-d404-4453-b8b6-9dab25a240a5 
req-60960d20-6196-4a02-a2f8-97cc006dd06c service nova] Releasing lock "refresh_cache-ca7b7941-c016-4968-9beb-f8c094ca16cd" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 664.755955] env[67144]: WARNING oslo_vmware.rw_handles [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 664.755955] env[67144]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 664.755955] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 664.755955] env[67144]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 664.755955] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 664.755955] env[67144]: ERROR oslo_vmware.rw_handles response.begin() [ 664.755955] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 664.755955] env[67144]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 664.755955] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 664.755955] env[67144]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 664.755955] env[67144]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 664.755955] env[67144]: ERROR oslo_vmware.rw_handles [ 664.755955] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] 
Downloaded image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to vmware_temp/2092258b-0b05-407b-a66c-31f8aa075834/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 664.765987] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Caching image {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 664.765987] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Copying Virtual Disk [datastore1] vmware_temp/2092258b-0b05-407b-a66c-31f8aa075834/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk to [datastore1] vmware_temp/2092258b-0b05-407b-a66c-31f8aa075834/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk {{(pid=67144) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 664.765987] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5c6733de-3a4f-4032-add8-6cc704a2db89 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.772525] env[67144]: DEBUG oslo_vmware.api [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Waiting for the task: (returnval){ [ 664.772525] env[67144]: value = "task-2848027" [ 664.772525] env[67144]: _type = "Task" [ 664.772525] env[67144]: } to complete. 
{{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 664.788031] env[67144]: DEBUG oslo_vmware.api [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Task: {'id': task-2848027, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 665.285892] env[67144]: DEBUG oslo_vmware.exceptions [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Fault InvalidArgument not matched. {{(pid=67144) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 665.287106] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 665.291569] env[67144]: ERROR nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 665.291569] env[67144]: Faults: ['InvalidArgument'] [ 665.291569] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Traceback (most recent call last): [ 665.291569] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in 
_build_resources [ 665.291569] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] yield resources [ 665.291569] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 665.291569] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] self.driver.spawn(context, instance, image_meta, [ 665.291569] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 665.291569] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 665.291569] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 665.291569] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] self._fetch_image_if_missing(context, vi) [ 665.291569] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] image_cache(vi, tmp_image_ds_loc) [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] vm_util.copy_virtual_disk( [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 665.291970] env[67144]: ERROR 
nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] session._wait_for_task(vmdk_copy_task) [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] return self.wait_for_task(task_ref) [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] return evt.wait() [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] result = hub.switch() [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 665.291970] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] return self.greenlet.switch() [ 665.292579] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 665.292579] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] self.f(*self.args, **self.kw) [ 665.292579] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 665.292579] env[67144]: ERROR nova.compute.manager [instance: 
668949c5-1c0b-46a5-a0bc-5406f774b2e3] raise exceptions.translate_fault(task_info.error) [ 665.292579] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 665.292579] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Faults: ['InvalidArgument'] [ 665.292579] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] [ 665.292579] env[67144]: INFO nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Terminating instance [ 665.292579] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 665.293305] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 665.293305] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5158577d-9ace-4f64-ac23-892e214f24b8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.297932] env[67144]: DEBUG nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 
tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Start destroying the instance on the hypervisor. {{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 665.298096] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 665.298946] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb5e052c-207f-44e2-994f-4fd463a29bb6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.307043] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 665.307342] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f81abe85-62be-49c9-83d6-e5ab4ec35957 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.315458] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 665.315679] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 
tempest-ServerDiagnosticsTest-447605912-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 665.316875] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d66964ef-270c-4eba-9ebe-23dec0598433 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.325671] env[67144]: DEBUG oslo_vmware.api [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Waiting for the task: (returnval){ [ 665.325671] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52d45a51-c65c-7ee2-e815-26bd54bc5912" [ 665.325671] env[67144]: _type = "Task" [ 665.325671] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 665.336758] env[67144]: DEBUG oslo_vmware.api [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52d45a51-c65c-7ee2-e815-26bd54bc5912, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 665.383726] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 665.383951] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 665.383951] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Deleting the datastore file [datastore1] 668949c5-1c0b-46a5-a0bc-5406f774b2e3 {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 665.384640] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4067eb3f-d1cf-4882-bda2-acbed1408cfb {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.393347] env[67144]: DEBUG oslo_vmware.api [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Waiting for the task: (returnval){ [ 665.393347] env[67144]: value = "task-2848029" [ 665.393347] env[67144]: _type = "Task" [ 665.393347] env[67144]: } to complete. 
{{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 665.403345] env[67144]: DEBUG oslo_vmware.api [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Task: {'id': task-2848029, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 665.841154] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 665.841430] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Creating directory with path [datastore1] vmware_temp/ba9e1370-3859-4686-ab96-021b32c36c3b/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 665.841831] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e1104e06-d2ed-4963-a009-a0fcb5078790 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.857048] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Created directory with path [datastore1] vmware_temp/ba9e1370-3859-4686-ab96-021b32c36c3b/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 665.861021] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 
tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Fetch image to [datastore1] vmware_temp/ba9e1370-3859-4686-ab96-021b32c36c3b/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 665.861228] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/ba9e1370-3859-4686-ab96-021b32c36c3b/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 665.862049] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aed9a812-1c73-4b0a-84a6-e8496559ecbe {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.872909] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7eecfd4-7c49-42c0-bc0c-6f06cddc0749 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.884116] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8d3d3ad-4c97-477f-b386-5ca1ccd5f0f9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.924464] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f92f82d-6163-4bec-8709-08039a225b13 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.935638] env[67144]: DEBUG 
oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3f1ab166-fa25-4fb7-9deb-8aa9130221f2 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.941371] env[67144]: DEBUG oslo_vmware.api [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Task: {'id': task-2848029, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069015} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 665.941371] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 665.941371] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 665.941371] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 665.941371] env[67144]: INFO nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Took 0.64 
seconds to destroy the instance on the hypervisor. [ 665.948314] env[67144]: DEBUG nova.compute.claims [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 665.948314] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.949074] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.996770] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 666.083784] env[67144]: DEBUG oslo_vmware.rw_handles [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ba9e1370-3859-4686-ab96-021b32c36c3b/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 666.159957] env[67144]: DEBUG oslo_vmware.rw_handles [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Completed reading data from the image iterator. {{(pid=67144) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 666.159957] env[67144]: DEBUG oslo_vmware.rw_handles [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ba9e1370-3859-4686-ab96-021b32c36c3b/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67144) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 666.263652] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6717c6b-25bd-4f7c-8ab1-8e115777259e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.273330] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f868b134-39ce-480c-809d-c0ce8f05f02c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.311859] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79882516-83a6-4eb5-b0ae-2f8ac6f3ebe6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.327262] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0eb07d6b-c43d-4643-b4fe-224ac4360c98 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.343112] env[67144]: DEBUG nova.compute.provider_tree [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 666.355446] env[67144]: DEBUG nova.scheduler.client.report [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 666.386466] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.435s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 666.386466] env[67144]: ERROR nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 666.386466] env[67144]: Faults: ['InvalidArgument'] [ 666.386466] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Traceback (most recent call last): [ 666.386466] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 666.386466] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] self.driver.spawn(context, instance, image_meta, [ 666.386466] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 666.386466] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
666.386466] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 666.386466] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] self._fetch_image_if_missing(context, vi) [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] image_cache(vi, tmp_image_ds_loc) [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] vm_util.copy_virtual_disk( [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] session._wait_for_task(vmdk_copy_task) [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] return self.wait_for_task(task_ref) [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] return evt.wait() [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 
668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] result = hub.switch() [ 666.387019] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 666.387804] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] return self.greenlet.switch() [ 666.387804] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 666.387804] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] self.f(*self.args, **self.kw) [ 666.387804] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 666.387804] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] raise exceptions.translate_fault(task_info.error) [ 666.387804] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 666.387804] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Faults: ['InvalidArgument'] [ 666.387804] env[67144]: ERROR nova.compute.manager [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] [ 666.387804] env[67144]: DEBUG nova.compute.utils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] VimFaultException {{(pid=67144) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 666.392561] env[67144]: DEBUG nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Build of instance 668949c5-1c0b-46a5-a0bc-5406f774b2e3 was re-scheduled: A specified parameter was not correct: fileType [ 666.392561] env[67144]: Faults: ['InvalidArgument'] {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 666.392561] env[67144]: DEBUG nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 666.392561] env[67144]: DEBUG nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 666.392561] env[67144]: DEBUG nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 666.392854] env[67144]: DEBUG nova.network.neutron [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 667.932243] env[67144]: DEBUG nova.network.neutron [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 667.946880] env[67144]: INFO nova.compute.manager [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] [instance: 668949c5-1c0b-46a5-a0bc-5406f774b2e3] Took 1.56 seconds to deallocate network for instance. 
[ 668.078222] env[67144]: INFO nova.scheduler.client.report [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Deleted allocations for instance 668949c5-1c0b-46a5-a0bc-5406f774b2e3
[ 668.110268] env[67144]: DEBUG oslo_concurrency.lockutils [None req-555c12b7-84d7-43d4-ac16-c39c32a4d71f tempest-ServerDiagnosticsNegativeTest-842278041 tempest-ServerDiagnosticsNegativeTest-842278041-project-member] Lock "668949c5-1c0b-46a5-a0bc-5406f774b2e3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 60.156s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 690.831722] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 690.868164] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 690.868164] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Starting heal instance info cache {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 690.868164] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Rebuilding the list of instances to heal {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 690.889815] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 54af505e-0f30-4848-bd14-04461db40664] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 690.889815] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 690.889815] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 690.890104] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 690.890104] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 690.891116] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 690.891116] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 690.891116] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 690.891116] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Didn't find any instances for network info cache update. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 690.891116] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 690.891517] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67144) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 690.891517] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 690.913223] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 690.913414] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 690.913562] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 690.913718] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67144) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 690.916428] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee4e1f0b-b217-4869-bced-e4836f35b068 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.927373] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef296c0f-3b6a-4b5d-91ab-dc28d415f43f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.943032] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88ce8ee8-f778-4377-b800-92a51587bee1 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.950899] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1506548e-1f2a-4248-91ff-fd320d103328 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.997129] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181056MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=67144) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 690.997341] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 690.997849] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 691.105970] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 54af505e-0f30-4848-bd14-04461db40664 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 691.108056] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b04052f8-b29f-4b32-b249-02b83d3d77f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 691.108056] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 99cbc3d9-8c82-4a32-8adb-59572bab2eca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 691.108056] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 6cbf4358-dcfa-471b-ae1a-e6a512c47d26 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 691.108056] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance c2d5335a-4332-4828-855d-380cdea64a1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 691.108204] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 5bb4c082-f5fc-42e6-891a-4866eef1add6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 691.108204] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 691.108204] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance ca7b7941-c016-4968-9beb-f8c094ca16cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 691.108204] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 691.108301] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 691.276618] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68db2f8c-973d-41f6-8f75-a8c14b95fbae {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 691.290687] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68751f24-f8bd-458e-b25e-e91152f5b6a9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 691.325989] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b6dc8e8-32a5-4d71-b0c5-325b2811ec05 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 691.335160] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfd4dc07-faa8-4ba0-a266-190690b94288 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 691.361031] env[67144]: DEBUG nova.compute.provider_tree [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 691.371960] env[67144]: DEBUG nova.scheduler.client.report [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 691.404362] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67144) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 691.404588] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.407s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 691.930660] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 691.930934] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 691.931086] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 691.931233] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 692.416996] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 692.419297] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 701.062721] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Acquiring lock "f61f525f-70a5-402f-bf52-0bd4041b907f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 701.062959] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Lock "f61f525f-70a5-402f-bf52-0bd4041b907f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 701.075310] env[67144]: DEBUG nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 701.138182] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 701.138555] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 701.141462] env[67144]: INFO nova.compute.claims [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 701.371962] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-932bd7a8-b993-4ac6-a785-9bead024bee4 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 701.380762] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65cc92b8-3567-4e05-94b0-6d3c87f092d3 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 701.420954] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1c90b10-569f-48bd-8039-caca48386b3c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 701.429704] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05f51860-1f6c-4584-baec-21a442a68b35 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 701.443934] env[67144]: DEBUG nova.compute.provider_tree [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 701.452961] env[67144]: DEBUG nova.scheduler.client.report [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 701.468375] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.330s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 701.468891] env[67144]: DEBUG nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 701.507313] env[67144]: DEBUG nova.compute.utils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 701.508908] env[67144]: DEBUG nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 701.508908] env[67144]: DEBUG nova.network.neutron [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 701.517819] env[67144]: DEBUG nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 701.598188] env[67144]: DEBUG nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Start spawning the instance on the hypervisor. {{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 701.624884] env[67144]: DEBUG nova.virt.hardware [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 701.625251] env[67144]: DEBUG nova.virt.hardware [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 701.625448] env[67144]: DEBUG nova.virt.hardware [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 701.625640] env[67144]: DEBUG nova.virt.hardware [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 701.625788] env[67144]: DEBUG nova.virt.hardware [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 701.626128] env[67144]: DEBUG nova.virt.hardware [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 701.626396] env[67144]: DEBUG nova.virt.hardware [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 701.626563] env[67144]: DEBUG nova.virt.hardware [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 701.626761] env[67144]: DEBUG nova.virt.hardware [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 701.626935] env[67144]: DEBUG nova.virt.hardware [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 701.627130] env[67144]: DEBUG nova.virt.hardware [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 701.628409] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d93db7c4-5854-454b-9063-36eeeabc84d1 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 701.637458] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19afaea9-91e6-44ca-bfec-ea9b19f88d86 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 701.725766] env[67144]: DEBUG nova.policy [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2b37ebba094b4844b841a40202c3532f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9d75445bcda7473ba3ae33ebf292a0c3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}}
[ 703.170035] env[67144]: DEBUG nova.network.neutron [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Successfully created port: 8e006206-1f62-42fa-b1da-025935a88d27 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 703.702521] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Acquiring lock "d4eaa8fd-84b5-47a2-832a-9106187bc531" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 703.702755] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Lock "d4eaa8fd-84b5-47a2-832a-9106187bc531" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 703.714272] env[67144]: DEBUG nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 703.790009] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 703.790009] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 703.794797] env[67144]: INFO nova.compute.claims [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 703.981503] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Acquiring lock "b1bba9da-84f7-4d67-8ad6-af7cb429dd9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 703.981736] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Lock "b1bba9da-84f7-4d67-8ad6-af7cb429dd9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 704.048140] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93aed457-5ead-47fb-8470-2db56ad21190 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 704.057794] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f684e816-d226-452a-af37-fe59cb428292 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 704.100541] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f7aa436-19f4-48e0-8896-7e605daae789 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 704.112601] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b88df5e-7879-4f2a-81fc-13b48e19dba6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 704.129921] env[67144]: DEBUG nova.compute.provider_tree [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 704.138736] env[67144]: DEBUG nova.scheduler.client.report [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 704.152765] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.364s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 704.153495] env[67144]: DEBUG nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 704.198253] env[67144]: DEBUG nova.compute.utils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 704.200334] env[67144]: DEBUG nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 704.202394] env[67144]: DEBUG nova.network.neutron [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 704.211466] env[67144]: DEBUG nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 704.280906] env[67144]: DEBUG nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Start spawning the instance on the hypervisor. {{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 704.303690] env[67144]: DEBUG nova.virt.hardware [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 704.303940] env[67144]: DEBUG nova.virt.hardware [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 704.304111] env[67144]: DEBUG nova.virt.hardware [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 704.304290] env[67144]: DEBUG nova.virt.hardware [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Flavor pref 0:0:0 {{(pid=67144)
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 704.304432] env[67144]: DEBUG nova.virt.hardware [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 704.304574] env[67144]: DEBUG nova.virt.hardware [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 704.304815] env[67144]: DEBUG nova.virt.hardware [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 704.304959] env[67144]: DEBUG nova.virt.hardware [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 704.305148] env[67144]: DEBUG nova.virt.hardware [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 704.306311] env[67144]: DEBUG nova.virt.hardware [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 704.306311] env[67144]: DEBUG nova.virt.hardware [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 704.306311] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-479cf49b-8ab1-49af-a3a8-68ceda250f75 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.314902] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e8bd584-d7bd-480c-8289-78ed6d4ed63a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.337592] env[67144]: DEBUG nova.network.neutron [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Successfully created port: 44714ba6-ad01-48a3-bfe7-d65dc34dd361 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 704.447246] env[67144]: DEBUG nova.policy [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31415b760f2448969c8852d5723d1e34', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '033e7dd3e529475d8a2f3983deab741b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 
'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 705.025451] env[67144]: DEBUG nova.network.neutron [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Successfully created port: 66332029-9ce1-424d-9899-20f64e4d004b {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 705.775739] env[67144]: DEBUG nova.network.neutron [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Successfully updated port: 8e006206-1f62-42fa-b1da-025935a88d27 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 706.416388] env[67144]: DEBUG nova.network.neutron [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Successfully updated port: 66332029-9ce1-424d-9899-20f64e4d004b {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 706.431413] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Acquiring lock "refresh_cache-d4eaa8fd-84b5-47a2-832a-9106187bc531" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 706.432098] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Acquired lock "refresh_cache-d4eaa8fd-84b5-47a2-832a-9106187bc531" {{(pid=67144) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 706.432304] env[67144]: DEBUG nova.network.neutron [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 706.512908] env[67144]: DEBUG nova.network.neutron [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Instance cache missing network info. {{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 706.724718] env[67144]: DEBUG nova.network.neutron [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Updating instance_info_cache with network_info: [{"id": "66332029-9ce1-424d-9899-20f64e4d004b", "address": "fa:16:3e:4b:9b:1a", "network": {"id": "a7f83c2a-4e2e-47bf-990d-1bead0e106e6", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1323607046-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "033e7dd3e529475d8a2f3983deab741b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e238ac23-819b-452f-9015-52922e45efd3", "external-id": "nsx-vlan-transportzone-127", "segmentation_id": 127, 
"bound_drivers": {"0": "nsxv3"}}, "devname": "tap66332029-9c", "ovs_interfaceid": "66332029-9ce1-424d-9899-20f64e4d004b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.736690] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Releasing lock "refresh_cache-d4eaa8fd-84b5-47a2-832a-9106187bc531" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 706.737241] env[67144]: DEBUG nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Instance network_info: |[{"id": "66332029-9ce1-424d-9899-20f64e4d004b", "address": "fa:16:3e:4b:9b:1a", "network": {"id": "a7f83c2a-4e2e-47bf-990d-1bead0e106e6", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1323607046-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "033e7dd3e529475d8a2f3983deab741b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e238ac23-819b-452f-9015-52922e45efd3", "external-id": "nsx-vlan-transportzone-127", "segmentation_id": 127, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap66332029-9c", "ovs_interfaceid": 
"66332029-9ce1-424d-9899-20f64e4d004b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 706.739477] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4b:9b:1a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e238ac23-819b-452f-9015-52922e45efd3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '66332029-9ce1-424d-9899-20f64e4d004b', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 706.747738] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Creating folder: Project (033e7dd3e529475d8a2f3983deab741b). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 706.748279] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-97148175-f12d-40bf-a97c-176a04162079 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.761273] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Created folder: Project (033e7dd3e529475d8a2f3983deab741b) in parent group-v572613. 
[ 706.761477] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Creating folder: Instances. Parent ref: group-v572649. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 706.762012] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5e3c348d-1032-4979-a571-09547b797360 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.775507] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Created folder: Instances in parent group-v572649. [ 706.776020] env[67144]: DEBUG oslo.service.loopingcall [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 706.776334] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 706.776571] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e30880d2-cebd-47be-9697-faefdc212fae {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.801261] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 706.801261] env[67144]: value = "task-2848044" [ 706.801261] env[67144]: _type = "Task" [ 706.801261] env[67144]: } to complete. 
{{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 706.810634] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848044, 'name': CreateVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 707.317026] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848044, 'name': CreateVM_Task, 'duration_secs': 0.353599} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 707.317026] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 707.317026] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 707.317026] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 707.317026] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 707.317291] env[67144]: DEBUG oslo_vmware.service [-] Invoking 
HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1981bcb2-6d41-438b-8c4b-db690b9590ff {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.326166] env[67144]: DEBUG oslo_vmware.api [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Waiting for the task: (returnval){ [ 707.326166] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5238516a-36c6-4ab5-eb4b-56caec4e9d74" [ 707.326166] env[67144]: _type = "Task" [ 707.326166] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 707.335090] env[67144]: DEBUG oslo_vmware.api [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5238516a-36c6-4ab5-eb4b-56caec4e9d74, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 707.841521] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 707.841932] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 707.842055] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 707.978773] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Acquiring lock "0811722e-2ae9-4018-a85d-ab4fe5f46370" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.979203] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Lock 
"0811722e-2ae9-4018-a85d-ab4fe5f46370" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 708.011345] env[67144]: DEBUG nova.network.neutron [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Successfully updated port: 44714ba6-ad01-48a3-bfe7-d65dc34dd361 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 708.029075] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Acquiring lock "refresh_cache-f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 708.029330] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Acquired lock "refresh_cache-f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 708.029330] env[67144]: DEBUG nova.network.neutron [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 708.068980] env[67144]: DEBUG nova.compute.manager [req-c69d96c5-f30a-4627-be7d-57b4fc463219 req-657e459c-556f-4981-ac45-5919b6bada18 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Received event 
network-vif-plugged-8e006206-1f62-42fa-b1da-025935a88d27 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 708.068980] env[67144]: DEBUG oslo_concurrency.lockutils [req-c69d96c5-f30a-4627-be7d-57b4fc463219 req-657e459c-556f-4981-ac45-5919b6bada18 service nova] Acquiring lock "f61f525f-70a5-402f-bf52-0bd4041b907f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 708.069253] env[67144]: DEBUG oslo_concurrency.lockutils [req-c69d96c5-f30a-4627-be7d-57b4fc463219 req-657e459c-556f-4981-ac45-5919b6bada18 service nova] Lock "f61f525f-70a5-402f-bf52-0bd4041b907f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 708.069361] env[67144]: DEBUG oslo_concurrency.lockutils [req-c69d96c5-f30a-4627-be7d-57b4fc463219 req-657e459c-556f-4981-ac45-5919b6bada18 service nova] Lock "f61f525f-70a5-402f-bf52-0bd4041b907f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.070064] env[67144]: DEBUG nova.compute.manager [req-c69d96c5-f30a-4627-be7d-57b4fc463219 req-657e459c-556f-4981-ac45-5919b6bada18 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] No waiting events found dispatching network-vif-plugged-8e006206-1f62-42fa-b1da-025935a88d27 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 708.070064] env[67144]: WARNING nova.compute.manager [req-c69d96c5-f30a-4627-be7d-57b4fc463219 req-657e459c-556f-4981-ac45-5919b6bada18 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Received unexpected event network-vif-plugged-8e006206-1f62-42fa-b1da-025935a88d27 for instance 
with vm_state building and task_state spawning. [ 708.089187] env[67144]: DEBUG nova.network.neutron [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Instance cache missing network info. {{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 708.168508] env[67144]: DEBUG nova.compute.manager [req-31135c86-1c2f-4cdf-b25a-e29832942f0c req-41d3606a-368c-425a-8db9-ae6b7dded480 service nova] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Received event network-vif-plugged-66332029-9ce1-424d-9899-20f64e4d004b {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 708.168508] env[67144]: DEBUG oslo_concurrency.lockutils [req-31135c86-1c2f-4cdf-b25a-e29832942f0c req-41d3606a-368c-425a-8db9-ae6b7dded480 service nova] Acquiring lock "d4eaa8fd-84b5-47a2-832a-9106187bc531-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 708.168793] env[67144]: DEBUG oslo_concurrency.lockutils [req-31135c86-1c2f-4cdf-b25a-e29832942f0c req-41d3606a-368c-425a-8db9-ae6b7dded480 service nova] Lock "d4eaa8fd-84b5-47a2-832a-9106187bc531-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 708.169023] env[67144]: DEBUG oslo_concurrency.lockutils [req-31135c86-1c2f-4cdf-b25a-e29832942f0c req-41d3606a-368c-425a-8db9-ae6b7dded480 service nova] Lock "d4eaa8fd-84b5-47a2-832a-9106187bc531-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.169334] env[67144]: DEBUG 
nova.compute.manager [req-31135c86-1c2f-4cdf-b25a-e29832942f0c req-41d3606a-368c-425a-8db9-ae6b7dded480 service nova] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] No waiting events found dispatching network-vif-plugged-66332029-9ce1-424d-9899-20f64e4d004b {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 708.169470] env[67144]: WARNING nova.compute.manager [req-31135c86-1c2f-4cdf-b25a-e29832942f0c req-41d3606a-368c-425a-8db9-ae6b7dded480 service nova] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Received unexpected event network-vif-plugged-66332029-9ce1-424d-9899-20f64e4d004b for instance with vm_state building and task_state spawning. [ 709.416160] env[67144]: DEBUG nova.network.neutron [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Updating instance_info_cache with network_info: [{"id": "8e006206-1f62-42fa-b1da-025935a88d27", "address": "fa:16:3e:f5:0c:2e", "network": {"id": "27a55d29-4601-4852-b0d8-40b8547f86ef", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1311409486", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.192", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9d75445bcda7473ba3ae33ebf292a0c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39a4aca0-934b-4a91-8779-6a4360c3f967", "external-id": "nsx-vlan-transportzone-454", "segmentation_id": 454, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8e006206-1f", "ovs_interfaceid": "8e006206-1f62-42fa-b1da-025935a88d27", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "44714ba6-ad01-48a3-bfe7-d65dc34dd361", "address": "fa:16:3e:48:aa:43", "network": {"id": "6088d26c-e076-4a2a-aadb-1af04680cb18", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-902518023", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "9d75445bcda7473ba3ae33ebf292a0c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6fb0104-186b-4288-b87e-634893f46f01", "external-id": "nsx-vlan-transportzone-73", "segmentation_id": 73, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44714ba6-ad", "ovs_interfaceid": "44714ba6-ad01-48a3-bfe7-d65dc34dd361", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 709.430481] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Releasing lock "refresh_cache-f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 709.431594] env[67144]: DEBUG nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: 
f61f525f-70a5-402f-bf52-0bd4041b907f] Instance network_info: |[{"id": "8e006206-1f62-42fa-b1da-025935a88d27", "address": "fa:16:3e:f5:0c:2e", "network": {"id": "27a55d29-4601-4852-b0d8-40b8547f86ef", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1311409486", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.192", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9d75445bcda7473ba3ae33ebf292a0c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39a4aca0-934b-4a91-8779-6a4360c3f967", "external-id": "nsx-vlan-transportzone-454", "segmentation_id": 454, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8e006206-1f", "ovs_interfaceid": "8e006206-1f62-42fa-b1da-025935a88d27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "44714ba6-ad01-48a3-bfe7-d65dc34dd361", "address": "fa:16:3e:48:aa:43", "network": {"id": "6088d26c-e076-4a2a-aadb-1af04680cb18", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-902518023", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "9d75445bcda7473ba3ae33ebf292a0c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "f6fb0104-186b-4288-b87e-634893f46f01", "external-id": "nsx-vlan-transportzone-73", "segmentation_id": 73, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44714ba6-ad", "ovs_interfaceid": "44714ba6-ad01-48a3-bfe7-d65dc34dd361", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 709.432812] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f5:0c:2e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '39a4aca0-934b-4a91-8779-6a4360c3f967', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8e006206-1f62-42fa-b1da-025935a88d27', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:48:aa:43', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f6fb0104-186b-4288-b87e-634893f46f01', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '44714ba6-ad01-48a3-bfe7-d65dc34dd361', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 709.444148] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Creating folder: Project (9d75445bcda7473ba3ae33ebf292a0c3). Parent ref: group-v572613. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 709.445107] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c33f0b2d-06ca-4090-9527-29bccf4d31d1 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.460193] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Created folder: Project (9d75445bcda7473ba3ae33ebf292a0c3) in parent group-v572613. [ 709.460193] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Creating folder: Instances. Parent ref: group-v572652. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 709.460193] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-387ccd1d-9e8a-4b27-bc04-6e13cdff65a3 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.473510] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Created folder: Instances in parent group-v572652. [ 709.473747] env[67144]: DEBUG oslo.service.loopingcall [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 709.473937] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 709.474174] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fe958d33-88f5-40ed-b9e7-072c5da7969f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.506648] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 709.506648] env[67144]: value = "task-2848047" [ 709.506648] env[67144]: _type = "Task" [ 709.506648] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 709.515727] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848047, 'name': CreateVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 710.023088] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848047, 'name': CreateVM_Task, 'duration_secs': 0.398432} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 710.023431] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 710.025382] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 710.026773] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 710.026773] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 710.026773] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c20584d5-be3c-4f3f-a9db-b3c2182a5ada {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.040819] env[67144]: DEBUG oslo_vmware.api [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Waiting for the task: (returnval){ [ 710.040819] env[67144]: 
value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5229ed95-47ab-09fa-fc8d-0804fed2c22c" [ 710.040819] env[67144]: _type = "Task" [ 710.040819] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 710.053370] env[67144]: DEBUG oslo_vmware.api [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5229ed95-47ab-09fa-fc8d-0804fed2c22c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 710.556757] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 710.556757] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 710.556757] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.159330] env[67144]: DEBUG oslo_concurrency.lockutils [None 
req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Acquiring lock "42ce3afe-e725-4688-b048-bd6721c22c35" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.159632] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Lock "42ce3afe-e725-4688-b048-bd6721c22c35" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.203097] env[67144]: DEBUG oslo_concurrency.lockutils [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Acquiring lock "48037468-8c60-4449-8297-46eadab5246e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.203357] env[67144]: DEBUG oslo_concurrency.lockutils [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Lock "48037468-8c60-4449-8297-46eadab5246e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 712.194619] env[67144]: DEBUG nova.compute.manager [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: 
f61f525f-70a5-402f-bf52-0bd4041b907f] Received event network-changed-8e006206-1f62-42fa-b1da-025935a88d27 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 712.194871] env[67144]: DEBUG nova.compute.manager [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Refreshing instance network info cache due to event network-changed-8e006206-1f62-42fa-b1da-025935a88d27. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 712.199116] env[67144]: DEBUG oslo_concurrency.lockutils [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] Acquiring lock "refresh_cache-f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 712.199354] env[67144]: DEBUG oslo_concurrency.lockutils [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] Acquired lock "refresh_cache-f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 712.199526] env[67144]: DEBUG nova.network.neutron [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Refreshing network info cache for port 8e006206-1f62-42fa-b1da-025935a88d27 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 712.222400] env[67144]: DEBUG nova.compute.manager [req-12bc3579-cee6-4c68-b5aa-e40fb73c5a93 req-425f67af-d27a-4f17-8cb3-54aa8324d291 service nova] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Received event network-changed-66332029-9ce1-424d-9899-20f64e4d004b {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 712.222700] env[67144]: DEBUG nova.compute.manager 
[req-12bc3579-cee6-4c68-b5aa-e40fb73c5a93 req-425f67af-d27a-4f17-8cb3-54aa8324d291 service nova] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Refreshing instance network info cache due to event network-changed-66332029-9ce1-424d-9899-20f64e4d004b. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 712.222955] env[67144]: DEBUG oslo_concurrency.lockutils [req-12bc3579-cee6-4c68-b5aa-e40fb73c5a93 req-425f67af-d27a-4f17-8cb3-54aa8324d291 service nova] Acquiring lock "refresh_cache-d4eaa8fd-84b5-47a2-832a-9106187bc531" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 712.223090] env[67144]: DEBUG oslo_concurrency.lockutils [req-12bc3579-cee6-4c68-b5aa-e40fb73c5a93 req-425f67af-d27a-4f17-8cb3-54aa8324d291 service nova] Acquired lock "refresh_cache-d4eaa8fd-84b5-47a2-832a-9106187bc531" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 712.223225] env[67144]: DEBUG nova.network.neutron [req-12bc3579-cee6-4c68-b5aa-e40fb73c5a93 req-425f67af-d27a-4f17-8cb3-54aa8324d291 service nova] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Refreshing network info cache for port 66332029-9ce1-424d-9899-20f64e4d004b {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 712.514420] env[67144]: DEBUG nova.network.neutron [req-12bc3579-cee6-4c68-b5aa-e40fb73c5a93 req-425f67af-d27a-4f17-8cb3-54aa8324d291 service nova] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Updated VIF entry in instance network info cache for port 66332029-9ce1-424d-9899-20f64e4d004b. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 712.514562] env[67144]: DEBUG nova.network.neutron [req-12bc3579-cee6-4c68-b5aa-e40fb73c5a93 req-425f67af-d27a-4f17-8cb3-54aa8324d291 service nova] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Updating instance_info_cache with network_info: [{"id": "66332029-9ce1-424d-9899-20f64e4d004b", "address": "fa:16:3e:4b:9b:1a", "network": {"id": "a7f83c2a-4e2e-47bf-990d-1bead0e106e6", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1323607046-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "033e7dd3e529475d8a2f3983deab741b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e238ac23-819b-452f-9015-52922e45efd3", "external-id": "nsx-vlan-transportzone-127", "segmentation_id": 127, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap66332029-9c", "ovs_interfaceid": "66332029-9ce1-424d-9899-20f64e4d004b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.537429] env[67144]: DEBUG oslo_concurrency.lockutils [req-12bc3579-cee6-4c68-b5aa-e40fb73c5a93 req-425f67af-d27a-4f17-8cb3-54aa8324d291 service nova] Releasing lock "refresh_cache-d4eaa8fd-84b5-47a2-832a-9106187bc531" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 712.719016] env[67144]: DEBUG nova.network.neutron 
[req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Updated VIF entry in instance network info cache for port 8e006206-1f62-42fa-b1da-025935a88d27. {{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 712.719016] env[67144]: DEBUG nova.network.neutron [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Updating instance_info_cache with network_info: [{"id": "8e006206-1f62-42fa-b1da-025935a88d27", "address": "fa:16:3e:f5:0c:2e", "network": {"id": "27a55d29-4601-4852-b0d8-40b8547f86ef", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1311409486", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.192", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9d75445bcda7473ba3ae33ebf292a0c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39a4aca0-934b-4a91-8779-6a4360c3f967", "external-id": "nsx-vlan-transportzone-454", "segmentation_id": 454, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8e006206-1f", "ovs_interfaceid": "8e006206-1f62-42fa-b1da-025935a88d27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "44714ba6-ad01-48a3-bfe7-d65dc34dd361", "address": "fa:16:3e:48:aa:43", "network": {"id": "6088d26c-e076-4a2a-aadb-1af04680cb18", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-902518023", "subnets": [{"cidr": "192.168.129.0/24", 
"dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "9d75445bcda7473ba3ae33ebf292a0c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6fb0104-186b-4288-b87e-634893f46f01", "external-id": "nsx-vlan-transportzone-73", "segmentation_id": 73, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44714ba6-ad", "ovs_interfaceid": "44714ba6-ad01-48a3-bfe7-d65dc34dd361", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.733207] env[67144]: DEBUG oslo_concurrency.lockutils [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] Releasing lock "refresh_cache-f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 712.735910] env[67144]: DEBUG nova.compute.manager [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Received event network-vif-plugged-44714ba6-ad01-48a3-bfe7-d65dc34dd361 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 712.735910] env[67144]: DEBUG oslo_concurrency.lockutils [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] Acquiring lock "f61f525f-70a5-402f-bf52-0bd4041b907f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 712.735910] env[67144]: DEBUG oslo_concurrency.lockutils [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] Lock "f61f525f-70a5-402f-bf52-0bd4041b907f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 712.735910] env[67144]: DEBUG oslo_concurrency.lockutils [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] Lock "f61f525f-70a5-402f-bf52-0bd4041b907f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 712.735910] env[67144]: DEBUG nova.compute.manager [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] No waiting events found dispatching network-vif-plugged-44714ba6-ad01-48a3-bfe7-d65dc34dd361 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 712.735910] env[67144]: WARNING nova.compute.manager [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Received unexpected event network-vif-plugged-44714ba6-ad01-48a3-bfe7-d65dc34dd361 for instance with vm_state building and task_state spawning. 
[ 712.735910] env[67144]: DEBUG nova.compute.manager [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Received event network-changed-44714ba6-ad01-48a3-bfe7-d65dc34dd361 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 712.735910] env[67144]: DEBUG nova.compute.manager [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Refreshing instance network info cache due to event network-changed-44714ba6-ad01-48a3-bfe7-d65dc34dd361. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 712.735910] env[67144]: DEBUG oslo_concurrency.lockutils [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] Acquiring lock "refresh_cache-f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 712.735910] env[67144]: DEBUG oslo_concurrency.lockutils [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] Acquired lock "refresh_cache-f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 712.735910] env[67144]: DEBUG nova.network.neutron [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Refreshing network info cache for port 44714ba6-ad01-48a3-bfe7-d65dc34dd361 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 713.088672] env[67144]: DEBUG nova.network.neutron [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Updated VIF entry in instance network info cache for port 
44714ba6-ad01-48a3-bfe7-d65dc34dd361. {{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 713.088672] env[67144]: DEBUG nova.network.neutron [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Updating instance_info_cache with network_info: [{"id": "8e006206-1f62-42fa-b1da-025935a88d27", "address": "fa:16:3e:f5:0c:2e", "network": {"id": "27a55d29-4601-4852-b0d8-40b8547f86ef", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1311409486", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.192", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9d75445bcda7473ba3ae33ebf292a0c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39a4aca0-934b-4a91-8779-6a4360c3f967", "external-id": "nsx-vlan-transportzone-454", "segmentation_id": 454, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8e006206-1f", "ovs_interfaceid": "8e006206-1f62-42fa-b1da-025935a88d27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "44714ba6-ad01-48a3-bfe7-d65dc34dd361", "address": "fa:16:3e:48:aa:43", "network": {"id": "6088d26c-e076-4a2a-aadb-1af04680cb18", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-902518023", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.150", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "9d75445bcda7473ba3ae33ebf292a0c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6fb0104-186b-4288-b87e-634893f46f01", "external-id": "nsx-vlan-transportzone-73", "segmentation_id": 73, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44714ba6-ad", "ovs_interfaceid": "44714ba6-ad01-48a3-bfe7-d65dc34dd361", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.098065] env[67144]: DEBUG oslo_concurrency.lockutils [req-94612171-bdc3-4784-852a-1d4b6e7cc388 req-8a38d733-46a6-4182-ab72-9a1ad7594193 service nova] Releasing lock "refresh_cache-f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 714.768966] env[67144]: WARNING oslo_vmware.rw_handles [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 714.768966] env[67144]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 714.768966] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 714.768966] env[67144]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 714.768966] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 714.768966] env[67144]: ERROR oslo_vmware.rw_handles response.begin() [ 714.768966] 
env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 714.768966] env[67144]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 714.768966] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 714.768966] env[67144]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 714.768966] env[67144]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 714.768966] env[67144]: ERROR oslo_vmware.rw_handles [ 714.769645] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Downloaded image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to vmware_temp/ba9e1370-3859-4686-ab96-021b32c36c3b/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 714.771464] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Caching image {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 714.771754] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Copying Virtual Disk [datastore1] vmware_temp/ba9e1370-3859-4686-ab96-021b32c36c3b/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk to [datastore1] vmware_temp/ba9e1370-3859-4686-ab96-021b32c36c3b/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk 
{{(pid=67144) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 714.774484] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a18d1633-1836-4636-b414-6db984130732 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.784346] env[67144]: DEBUG oslo_vmware.api [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Waiting for the task: (returnval){ [ 714.784346] env[67144]: value = "task-2848048" [ 714.784346] env[67144]: _type = "Task" [ 714.784346] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 714.794954] env[67144]: DEBUG oslo_vmware.api [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Task: {'id': task-2848048, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 715.296499] env[67144]: DEBUG oslo_vmware.exceptions [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Fault InvalidArgument not matched. 
{{(pid=67144) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 715.296785] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 715.299208] env[67144]: ERROR nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 715.299208] env[67144]: Faults: ['InvalidArgument'] [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] Traceback (most recent call last): [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] yield resources [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] self.driver.spawn(context, instance, image_meta, [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 
54af505e-0f30-4848-bd14-04461db40664] self._vmops.spawn(context, instance, image_meta, injected_files, [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] self._fetch_image_if_missing(context, vi) [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] image_cache(vi, tmp_image_ds_loc) [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] vm_util.copy_virtual_disk( [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] session._wait_for_task(vmdk_copy_task) [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] return self.wait_for_task(task_ref) [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 
54af505e-0f30-4848-bd14-04461db40664] return evt.wait() [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] result = hub.switch() [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] return self.greenlet.switch() [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] self.f(*self.args, **self.kw) [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] raise exceptions.translate_fault(task_info.error) [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] Faults: ['InvalidArgument'] [ 715.299208] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] [ 715.299985] env[67144]: INFO nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 
54af505e-0f30-4848-bd14-04461db40664] Terminating instance [ 715.301396] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 715.301595] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 715.302506] env[67144]: DEBUG nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Start destroying the instance on the hypervisor. 
{{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 715.302712] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 715.302939] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9adffbb8-c745-45a2-bbdf-b2d233a963f8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.308769] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e41188d9-0899-4349-b3f9-dd429a62fe4c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.314252] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 715.314508] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-77bcdc45-e74a-4103-81be-2797fec11508 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.318449] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 715.318920] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 
tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 715.319609] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-85fc8186-66f3-4a72-85b2-c40a1597948d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.325350] env[67144]: DEBUG oslo_vmware.api [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Waiting for the task: (returnval){ [ 715.325350] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5226cc62-f9fe-70a3-fda0-9486d4ac2769" [ 715.325350] env[67144]: _type = "Task" [ 715.325350] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 715.342237] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 715.342874] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Creating directory with path [datastore1] vmware_temp/b817c67e-7594-45f2-a45a-95b69baf5394/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 715.342874] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bb7e57c5-3e71-48cf-8360-4ab68195b7b5 {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.371023] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Created directory with path [datastore1] vmware_temp/b817c67e-7594-45f2-a45a-95b69baf5394/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 715.371023] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Fetch image to [datastore1] vmware_temp/b817c67e-7594-45f2-a45a-95b69baf5394/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 715.371023] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/b817c67e-7594-45f2-a45a-95b69baf5394/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 715.371023] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf4fcb24-5d08-4c17-9af8-2c35970d7619 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.378721] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b3f34c7-d9c3-4743-bd78-857ba1cbebd7 {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.394108] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de1bb83c-28f3-42e2-9c22-2bc7691d76a2 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.403580] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 715.403943] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 715.404228] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Deleting the datastore file [datastore1] 54af505e-0f30-4848-bd14-04461db40664 {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 715.405280] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-18d4fed2-20c3-4da1-8eda-4cdec476b047 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.436902] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e4d7fe-f149-4fe8-b792-f305fabddd57 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.442316] env[67144]: DEBUG 
oslo_vmware.api [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Waiting for the task: (returnval){ [ 715.442316] env[67144]: value = "task-2848050" [ 715.442316] env[67144]: _type = "Task" [ 715.442316] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 715.447413] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e4eb471b-aef0-4080-8556-d498b28c1d35 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.454844] env[67144]: DEBUG oslo_vmware.api [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Task: {'id': task-2848050, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 715.477061] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Acquiring lock "c3621484-8333-4375-9700-62b08d90887f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.477439] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Lock "c3621484-8333-4375-9700-62b08d90887f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.486050] env[67144]: 
DEBUG nova.virt.vmwareapi.images [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 715.516639] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f19d2ac8-ce01-4774-90b8-44cf7886f473 tempest-SecurityGroupsTestJSON-806931160 tempest-SecurityGroupsTestJSON-806931160-project-member] Acquiring lock "eebe36ea-6a07-4806-bade-4222dcf24247" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.516871] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f19d2ac8-ce01-4774-90b8-44cf7886f473 tempest-SecurityGroupsTestJSON-806931160 tempest-SecurityGroupsTestJSON-806931160-project-member] Lock "eebe36ea-6a07-4806-bade-4222dcf24247" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.546453] env[67144]: DEBUG oslo_vmware.rw_handles [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b817c67e-7594-45f2-a45a-95b69baf5394/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67144) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 715.608679] env[67144]: DEBUG oslo_vmware.rw_handles [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Completed reading data from the image iterator. {{(pid=67144) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 715.608872] env[67144]: DEBUG oslo_vmware.rw_handles [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b817c67e-7594-45f2-a45a-95b69baf5394/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 715.957285] env[67144]: DEBUG oslo_vmware.api [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Task: {'id': task-2848050, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073683} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 715.957580] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 715.957798] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 715.958119] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 715.958342] env[67144]: INFO nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Took 0.66 seconds to destroy the instance on the hypervisor. 
[ 715.960752] env[67144]: DEBUG nova.compute.claims [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 715.961026] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.961130] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.981550] env[67144]: DEBUG oslo_concurrency.lockutils [None req-34da9e61-6b9f-4fea-b7c9-4c1ab530d84d tempest-VolumesAdminNegativeTest-109429009 tempest-VolumesAdminNegativeTest-109429009-project-member] Acquiring lock "3a37ecb3-0196-4230-adea-ed14355ece08" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.981906] env[67144]: DEBUG oslo_concurrency.lockutils [None req-34da9e61-6b9f-4fea-b7c9-4c1ab530d84d tempest-VolumesAdminNegativeTest-109429009 tempest-VolumesAdminNegativeTest-109429009-project-member] Lock "3a37ecb3-0196-4230-adea-ed14355ece08" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.317822] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcea7023-f738-452b-a948-7eb7749b3464 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.327176] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c07d88ed-9270-459e-9202-025059651188 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.362416] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5004be33-98fb-4148-9983-43f83392e70d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.372733] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1ed36fc-2ad7-43f0-b67a-8366bbfa000d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.388021] env[67144]: DEBUG nova.compute.provider_tree [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 716.403168] env[67144]: DEBUG nova.scheduler.client.report [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 
65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 716.492514] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.531s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.493085] env[67144]: ERROR nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 716.493085] env[67144]: Faults: ['InvalidArgument'] [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] Traceback (most recent call last): [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] self.driver.spawn(context, instance, image_meta, [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] self._vmops.spawn(context, instance, image_meta, injected_files, [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 
54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] self._fetch_image_if_missing(context, vi) [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] image_cache(vi, tmp_image_ds_loc) [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] vm_util.copy_virtual_disk( [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] session._wait_for_task(vmdk_copy_task) [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] return self.wait_for_task(task_ref) [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] return evt.wait() [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] result = hub.switch() [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] return self.greenlet.switch() [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] self.f(*self.args, **self.kw) [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] raise exceptions.translate_fault(task_info.error) [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] Faults: ['InvalidArgument'] [ 716.493085] env[67144]: ERROR nova.compute.manager [instance: 54af505e-0f30-4848-bd14-04461db40664] [ 716.493887] env[67144]: DEBUG nova.compute.utils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] VimFaultException {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 716.495412] 
env[67144]: DEBUG nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Build of instance 54af505e-0f30-4848-bd14-04461db40664 was re-scheduled: A specified parameter was not correct: fileType [ 716.495412] env[67144]: Faults: ['InvalidArgument'] {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 716.495840] env[67144]: DEBUG nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 716.495954] env[67144]: DEBUG nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 716.496125] env[67144]: DEBUG nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 716.496357] env[67144]: DEBUG nova.network.neutron [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 717.374319] env[67144]: DEBUG nova.network.neutron [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.390921] env[67144]: INFO nova.compute.manager [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] [instance: 54af505e-0f30-4848-bd14-04461db40664] Took 0.89 seconds to deallocate network for instance. 
[ 717.514098] env[67144]: INFO nova.scheduler.client.report [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Deleted allocations for instance 54af505e-0f30-4848-bd14-04461db40664 [ 717.531793] env[67144]: DEBUG oslo_concurrency.lockutils [None req-feb11af4-61ee-4148-9477-90ec9eedd335 tempest-ServerDiagnosticsTest-447605912 tempest-ServerDiagnosticsTest-447605912-project-member] Lock "54af505e-0f30-4848-bd14-04461db40664" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 105.897s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 717.569460] env[67144]: DEBUG nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Starting instance... 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 717.626041] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.626306] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.627834] env[67144]: INFO nova.compute.claims [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 717.969409] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1af4e7f-9a68-44e7-b531-f5a9b114bda5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.978239] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f7bd00c-b7cb-4362-8e91-1a631e7571bf {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.022559] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7ec2dcd-bf8a-492c-a2c8-2d8ee9f0ac27 {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.031636] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5385af7d-3a48-4bb3-a229-9b1cfb453252 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.055291] env[67144]: DEBUG nova.compute.provider_tree [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 718.064898] env[67144]: DEBUG nova.scheduler.client.report [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 718.080797] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.453s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.080797] env[67144]: DEBUG nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 
tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 718.131791] env[67144]: DEBUG nova.compute.utils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 718.134745] env[67144]: DEBUG nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 718.135053] env[67144]: DEBUG nova.network.neutron [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 718.148182] env[67144]: DEBUG nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 718.219898] env[67144]: DEBUG nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Start spawning the instance on the hypervisor. 
{{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 718.245794] env[67144]: DEBUG nova.virt.hardware [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 718.245988] env[67144]: DEBUG nova.virt.hardware [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 718.246204] env[67144]: DEBUG nova.virt.hardware [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 718.246464] env[67144]: DEBUG nova.virt.hardware [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Flavor pref 0:0:0 
{{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 718.246625] env[67144]: DEBUG nova.virt.hardware [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 718.246772] env[67144]: DEBUG nova.virt.hardware [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 718.247141] env[67144]: DEBUG nova.virt.hardware [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 718.247205] env[67144]: DEBUG nova.virt.hardware [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 718.247370] env[67144]: DEBUG nova.virt.hardware [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 718.247532] env[67144]: DEBUG nova.virt.hardware [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 718.247703] env[67144]: DEBUG nova.virt.hardware [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 718.248556] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f2b536b-6e32-43fc-a61e-19c51c4553aa {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.258705] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca6e1e0d-d0c3-4177-8090-938dcbc76548 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.290317] env[67144]: DEBUG oslo_concurrency.lockutils [None req-d704f149-ad26-4304-bd93-53bc9d920373 tempest-ServersTestJSON-1313036657 tempest-ServersTestJSON-1313036657-project-member] Acquiring lock "abda0de6-f344-4dd1-b439-42826b59de5a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.290545] env[67144]: DEBUG oslo_concurrency.lockutils [None req-d704f149-ad26-4304-bd93-53bc9d920373 tempest-ServersTestJSON-1313036657 tempest-ServersTestJSON-1313036657-project-member] Lock "abda0de6-f344-4dd1-b439-42826b59de5a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.424700] env[67144]: DEBUG nova.policy [None 
req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '25cc1682cb8c462180e9c52dd0ced64a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5c724dee60f4549aab934f2de854021', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 719.194990] env[67144]: DEBUG nova.network.neutron [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Successfully created port: 5c68f2d8-0093-49d8-9fa5-2933ec72f8c0 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 720.959548] env[67144]: DEBUG nova.network.neutron [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Successfully updated port: 5c68f2d8-0093-49d8-9fa5-2933ec72f8c0 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 720.975580] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Acquiring lock "refresh_cache-b1bba9da-84f7-4d67-8ad6-af7cb429dd9c" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 720.976703] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 
tempest-ServerAddressesTestJSON-52359226-project-member] Acquired lock "refresh_cache-b1bba9da-84f7-4d67-8ad6-af7cb429dd9c" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 720.976907] env[67144]: DEBUG nova.network.neutron [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 721.048848] env[67144]: DEBUG nova.network.neutron [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Instance cache missing network info. {{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.242398] env[67144]: DEBUG nova.compute.manager [req-ac0a9919-9788-494b-aa7e-0eb92f0d8baf req-ea9e134e-bb73-4782-b759-de2ba042b10f service nova] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Received event network-vif-plugged-5c68f2d8-0093-49d8-9fa5-2933ec72f8c0 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 721.243645] env[67144]: DEBUG oslo_concurrency.lockutils [req-ac0a9919-9788-494b-aa7e-0eb92f0d8baf req-ea9e134e-bb73-4782-b759-de2ba042b10f service nova] Acquiring lock "b1bba9da-84f7-4d67-8ad6-af7cb429dd9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.243645] env[67144]: DEBUG oslo_concurrency.lockutils [req-ac0a9919-9788-494b-aa7e-0eb92f0d8baf req-ea9e134e-bb73-4782-b759-de2ba042b10f service nova] Lock "b1bba9da-84f7-4d67-8ad6-af7cb429dd9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: 
waited 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.244942] env[67144]: DEBUG oslo_concurrency.lockutils [req-ac0a9919-9788-494b-aa7e-0eb92f0d8baf req-ea9e134e-bb73-4782-b759-de2ba042b10f service nova] Lock "b1bba9da-84f7-4d67-8ad6-af7cb429dd9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.245132] env[67144]: DEBUG nova.compute.manager [req-ac0a9919-9788-494b-aa7e-0eb92f0d8baf req-ea9e134e-bb73-4782-b759-de2ba042b10f service nova] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] No waiting events found dispatching network-vif-plugged-5c68f2d8-0093-49d8-9fa5-2933ec72f8c0 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 721.245558] env[67144]: WARNING nova.compute.manager [req-ac0a9919-9788-494b-aa7e-0eb92f0d8baf req-ea9e134e-bb73-4782-b759-de2ba042b10f service nova] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Received unexpected event network-vif-plugged-5c68f2d8-0093-49d8-9fa5-2933ec72f8c0 for instance with vm_state building and task_state spawning. 
[ 721.329166] env[67144]: DEBUG nova.network.neutron [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Updating instance_info_cache with network_info: [{"id": "5c68f2d8-0093-49d8-9fa5-2933ec72f8c0", "address": "fa:16:3e:cf:ed:51", "network": {"id": "3efa7474-79fe-493f-8236-a26b8946be57", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-596250889-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d5c724dee60f4549aab934f2de854021", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bf86b133-2b7b-4cab-8f6f-5a0856d34c7b", "external-id": "nsx-vlan-transportzone-557", "segmentation_id": 557, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c68f2d8-00", "ovs_interfaceid": "5c68f2d8-0093-49d8-9fa5-2933ec72f8c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.342969] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Releasing lock "refresh_cache-b1bba9da-84f7-4d67-8ad6-af7cb429dd9c" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 721.343377] env[67144]: DEBUG nova.compute.manager [None 
req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Instance network_info: |[{"id": "5c68f2d8-0093-49d8-9fa5-2933ec72f8c0", "address": "fa:16:3e:cf:ed:51", "network": {"id": "3efa7474-79fe-493f-8236-a26b8946be57", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-596250889-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d5c724dee60f4549aab934f2de854021", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bf86b133-2b7b-4cab-8f6f-5a0856d34c7b", "external-id": "nsx-vlan-transportzone-557", "segmentation_id": 557, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c68f2d8-00", "ovs_interfaceid": "5c68f2d8-0093-49d8-9fa5-2933ec72f8c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 721.344020] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cf:ed:51', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'bf86b133-2b7b-4cab-8f6f-5a0856d34c7b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 
'5c68f2d8-0093-49d8-9fa5-2933ec72f8c0', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 721.352689] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Creating folder: Project (d5c724dee60f4549aab934f2de854021). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 721.353306] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bdcf55ca-6f59-44c0-80cb-3e76f1581a0a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.365871] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Created folder: Project (d5c724dee60f4549aab934f2de854021) in parent group-v572613. [ 721.366115] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Creating folder: Instances. Parent ref: group-v572655. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 721.366630] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ae5e692c-0d24-446b-a65c-ec8a9e6d8e49 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.379356] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Created folder: Instances in parent group-v572655. 
[ 721.379619] env[67144]: DEBUG oslo.service.loopingcall [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 721.379809] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 721.380014] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-db816b2d-f1ac-4590-a1da-7899f5864d72 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.401561] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 721.401561] env[67144]: value = "task-2848053" [ 721.401561] env[67144]: _type = "Task" [ 721.401561] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 721.412028] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848053, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 721.809231] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquiring lock "b932a680-76a5-4f08-ac38-2fc1578b4a86" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.809518] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Lock "b932a680-76a5-4f08-ac38-2fc1578b4a86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.912927] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848053, 'name': CreateVM_Task, 'duration_secs': 0.327759} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 721.913365] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 721.914068] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 721.914240] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 721.914559] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 721.914803] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ecd56873-6ff0-4b4a-89b8-8e949a3f278b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.921064] env[67144]: DEBUG oslo_vmware.api [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Waiting for the task: (returnval){ [ 
721.921064] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]522b1683-bd08-0336-b5e9-c6164aa47d21" [ 721.921064] env[67144]: _type = "Task" [ 721.921064] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 721.931882] env[67144]: DEBUG oslo_vmware.api [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]522b1683-bd08-0336-b5e9-c6164aa47d21, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 722.434290] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 722.434290] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 722.434290] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 722.496875] env[67144]: DEBUG oslo_concurrency.lockutils 
[None req-144b1a49-1d95-4d3d-86a1-ef360ebc0355 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquiring lock "56ba6c8d-1717-4d07-b547-7872f985b0f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.497188] env[67144]: DEBUG oslo_concurrency.lockutils [None req-144b1a49-1d95-4d3d-86a1-ef360ebc0355 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Lock "56ba6c8d-1717-4d07-b547-7872f985b0f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.281186] env[67144]: DEBUG nova.compute.manager [req-edfcefc8-5444-49bb-ae6a-c25db9e6749c req-5edd44c5-e017-4907-bc06-2d04b26ee2a2 service nova] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Received event network-changed-5c68f2d8-0093-49d8-9fa5-2933ec72f8c0 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 723.281380] env[67144]: DEBUG nova.compute.manager [req-edfcefc8-5444-49bb-ae6a-c25db9e6749c req-5edd44c5-e017-4907-bc06-2d04b26ee2a2 service nova] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Refreshing instance network info cache due to event network-changed-5c68f2d8-0093-49d8-9fa5-2933ec72f8c0. 
{{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 723.281587] env[67144]: DEBUG oslo_concurrency.lockutils [req-edfcefc8-5444-49bb-ae6a-c25db9e6749c req-5edd44c5-e017-4907-bc06-2d04b26ee2a2 service nova] Acquiring lock "refresh_cache-b1bba9da-84f7-4d67-8ad6-af7cb429dd9c" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 723.281728] env[67144]: DEBUG oslo_concurrency.lockutils [req-edfcefc8-5444-49bb-ae6a-c25db9e6749c req-5edd44c5-e017-4907-bc06-2d04b26ee2a2 service nova] Acquired lock "refresh_cache-b1bba9da-84f7-4d67-8ad6-af7cb429dd9c" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 723.281884] env[67144]: DEBUG nova.network.neutron [req-edfcefc8-5444-49bb-ae6a-c25db9e6749c req-5edd44c5-e017-4907-bc06-2d04b26ee2a2 service nova] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Refreshing network info cache for port 5c68f2d8-0093-49d8-9fa5-2933ec72f8c0 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 723.654676] env[67144]: DEBUG nova.network.neutron [req-edfcefc8-5444-49bb-ae6a-c25db9e6749c req-5edd44c5-e017-4907-bc06-2d04b26ee2a2 service nova] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Updated VIF entry in instance network info cache for port 5c68f2d8-0093-49d8-9fa5-2933ec72f8c0. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 723.655286] env[67144]: DEBUG nova.network.neutron [req-edfcefc8-5444-49bb-ae6a-c25db9e6749c req-5edd44c5-e017-4907-bc06-2d04b26ee2a2 service nova] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Updating instance_info_cache with network_info: [{"id": "5c68f2d8-0093-49d8-9fa5-2933ec72f8c0", "address": "fa:16:3e:cf:ed:51", "network": {"id": "3efa7474-79fe-493f-8236-a26b8946be57", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-596250889-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d5c724dee60f4549aab934f2de854021", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bf86b133-2b7b-4cab-8f6f-5a0856d34c7b", "external-id": "nsx-vlan-transportzone-557", "segmentation_id": 557, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c68f2d8-00", "ovs_interfaceid": "5c68f2d8-0093-49d8-9fa5-2933ec72f8c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.674437] env[67144]: DEBUG oslo_concurrency.lockutils [req-edfcefc8-5444-49bb-ae6a-c25db9e6749c req-5edd44c5-e017-4907-bc06-2d04b26ee2a2 service nova] Releasing lock "refresh_cache-b1bba9da-84f7-4d67-8ad6-af7cb429dd9c" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 724.533414] env[67144]: DEBUG oslo_concurrency.lockutils [None 
req-b3c7b758-fbcf-4844-9177-f4e7f25caab5 tempest-AttachVolumeTestJSON-1172703336 tempest-AttachVolumeTestJSON-1172703336-project-member] Acquiring lock "41193ca9-3f5f-43a2-9335-1010b1f752a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.533710] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b3c7b758-fbcf-4844-9177-f4e7f25caab5 tempest-AttachVolumeTestJSON-1172703336 tempest-AttachVolumeTestJSON-1172703336-project-member] Lock "41193ca9-3f5f-43a2-9335-1010b1f752a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 727.320862] env[67144]: DEBUG oslo_concurrency.lockutils [None req-06269f76-02bd-4854-a336-baa7f50f48fb tempest-ServerRescueTestJSON-2036457023 tempest-ServerRescueTestJSON-2036457023-project-member] Acquiring lock "e64bc93e-f99f-4f9e-a41e-283d405b1b92" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 727.321183] env[67144]: DEBUG oslo_concurrency.lockutils [None req-06269f76-02bd-4854-a336-baa7f50f48fb tempest-ServerRescueTestJSON-2036457023 tempest-ServerRescueTestJSON-2036457023-project-member] Lock "e64bc93e-f99f-4f9e-a41e-283d405b1b92" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.350166] env[67144]: DEBUG oslo_concurrency.lockutils [None req-4507ce0d-70e8-4105-ab23-3a7cd8cfb758 tempest-ServerDiagnosticsV248Test-2018146697 tempest-ServerDiagnosticsV248Test-2018146697-project-member] Acquiring lock 
"07259d91-ca24-4e5e-8340-d72f3b8e2776" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.350572] env[67144]: DEBUG oslo_concurrency.lockutils [None req-4507ce0d-70e8-4105-ab23-3a7cd8cfb758 tempest-ServerDiagnosticsV248Test-2018146697 tempest-ServerDiagnosticsV248Test-2018146697-project-member] Lock "07259d91-ca24-4e5e-8340-d72f3b8e2776" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.418026] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.429826] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.429826] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.429826] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67144) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.429826] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67144) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 750.429826] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a27098e5-7ff1-451e-bc27-1601c66807b5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.438555] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41d70ae8-9d48-41eb-94db-f4ce2234ce5d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.453874] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec4c968c-b566-4e1d-8e02-2d72cb5c81c0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.460346] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88620bc1-23fa-4706-b2e2-0e5487ab616e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.490313] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181057MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=67144) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 750.490475] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.490678] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.556219] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b04052f8-b29f-4b32-b249-02b83d3d77f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.556382] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 99cbc3d9-8c82-4a32-8adb-59572bab2eca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.556511] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 6cbf4358-dcfa-471b-ae1a-e6a512c47d26 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.556633] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance c2d5335a-4332-4828-855d-380cdea64a1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.556753] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 5bb4c082-f5fc-42e6-891a-4866eef1add6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.556874] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.557089] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance ca7b7941-c016-4968-9beb-f8c094ca16cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.557167] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance f61f525f-70a5-402f-bf52-0bd4041b907f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.557284] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance d4eaa8fd-84b5-47a2-832a-9106187bc531 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.557400] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b1bba9da-84f7-4d67-8ad6-af7cb429dd9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.583174] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 0811722e-2ae9-4018-a85d-ab4fe5f46370 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.607457] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 42ce3afe-e725-4688-b048-bd6721c22c35 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.618363] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 48037468-8c60-4449-8297-46eadab5246e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.629145] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance c3621484-8333-4375-9700-62b08d90887f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.644372] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance eebe36ea-6a07-4806-bade-4222dcf24247 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.654764] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 3a37ecb3-0196-4230-adea-ed14355ece08 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.664917] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance abda0de6-f344-4dd1-b439-42826b59de5a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.676466] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b932a680-76a5-4f08-ac38-2fc1578b4a86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.686481] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 56ba6c8d-1717-4d07-b547-7872f985b0f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.696409] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 41193ca9-3f5f-43a2-9335-1010b1f752a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.705545] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance e64bc93e-f99f-4f9e-a41e-283d405b1b92 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.714680] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 07259d91-ca24-4e5e-8340-d72f3b8e2776 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.714871] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 750.715037] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 750.947976] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4de5d019-f7e4-44dd-b51e-c6ee123cfdd9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.955451] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f22b9028-4d0c-45eb-b2a7-a60f9f710e6a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.984731] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-805be8a8-de84-4310-8fc4-b8e3581f4d3a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.991434] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a66015e-8ce3-4a87-9fb6-e300b2063695 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.004251] env[67144]: DEBUG nova.compute.provider_tree [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed in ProviderTree for provider: 
0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 751.012369] env[67144]: DEBUG nova.scheduler.client.report [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 751.025212] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67144) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 751.025400] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.535s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.024670] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.025147] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Starting heal instance info cache {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 752.025147] env[67144]: DEBUG 
nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Rebuilding the list of instances to heal {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 752.046748] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 752.047027] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 752.047492] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 752.047492] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 752.047492] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Skipping network cache update for instance because it is Building. 
{{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 752.047669] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 752.047767] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 752.047900] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 752.048032] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 752.048156] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 752.048274] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Didn't find any instances for network info cache update. 
{{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 752.048901] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.049192] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.049479] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.416980] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.417341] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.417419] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.417541] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67144) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 754.417815] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 764.728484] env[67144]: WARNING oslo_vmware.rw_handles [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 764.728484] env[67144]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 764.728484] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 764.728484] env[67144]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 764.728484] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 764.728484] env[67144]: ERROR oslo_vmware.rw_handles response.begin() [ 764.728484] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 764.728484] env[67144]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 764.728484] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 764.728484] env[67144]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 764.728484] env[67144]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 764.728484] env[67144]: ERROR oslo_vmware.rw_handles [ 764.729140] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-a349202e-c873-4977-a6b9-3a16583d3d77 
tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Downloaded image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to vmware_temp/b817c67e-7594-45f2-a45a-95b69baf5394/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 764.730281] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Caching image {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 764.730518] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Copying Virtual Disk [datastore1] vmware_temp/b817c67e-7594-45f2-a45a-95b69baf5394/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk to [datastore1] vmware_temp/b817c67e-7594-45f2-a45a-95b69baf5394/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk {{(pid=67144) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 764.730789] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e46d980e-ed1c-48d3-a96c-be0320e79e77 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 764.739285] env[67144]: DEBUG oslo_vmware.api [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Waiting for the task: (returnval){
[ 764.739285] env[67144]: value = "task-2848054"
[ 764.739285] env[67144]: _type = "Task"
[ 764.739285] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 764.746818] env[67144]: DEBUG oslo_vmware.api [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Task: {'id': task-2848054, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 765.250262] env[67144]: DEBUG oslo_vmware.exceptions [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Fault InvalidArgument not matched. {{(pid=67144) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 765.250510] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 765.251064] env[67144]: ERROR nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 765.251064] env[67144]: Faults: ['InvalidArgument']
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Traceback (most recent call last):
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     yield resources
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     self.driver.spawn(context, instance, image_meta,
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     self._fetch_image_if_missing(context, vi)
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     image_cache(vi, tmp_image_ds_loc)
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     vm_util.copy_virtual_disk(
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     session._wait_for_task(vmdk_copy_task)
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     return self.wait_for_task(task_ref)
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     return evt.wait()
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     result = hub.switch()
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     return self.greenlet.switch()
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     self.f(*self.args, **self.kw)
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     raise exceptions.translate_fault(task_info.error)
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Faults: ['InvalidArgument']
[ 765.251064] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]
[ 765.251899] env[67144]: INFO nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Terminating instance
[ 765.252841] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 765.253106] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 765.253336] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d3c500ac-88e8-4814-a31e-269d34276c2e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 765.255376] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquiring lock "refresh_cache-99cbc3d9-8c82-4a32-8adb-59572bab2eca" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 765.255519] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquired lock "refresh_cache-99cbc3d9-8c82-4a32-8adb-59572bab2eca" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 765.255682] env[67144]: DEBUG nova.network.neutron [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 765.262888] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 765.263884] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 765.264804] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-98cdeb93-327a-4981-b246-7827b8bfff49 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 765.273467] env[67144]: DEBUG oslo_vmware.api [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Waiting for the task: (returnval){
[ 765.273467] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52e5b7f5-b10b-8a78-e116-249acd3b1c21"
[ 765.273467] env[67144]: _type = "Task"
[ 765.273467] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 765.280800] env[67144]: DEBUG oslo_vmware.api [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52e5b7f5-b10b-8a78-e116-249acd3b1c21, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 765.295215] env[67144]: DEBUG nova.network.neutron [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Instance cache missing network info. {{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 765.426205] env[67144]: DEBUG nova.network.neutron [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 765.434889] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Releasing lock "refresh_cache-99cbc3d9-8c82-4a32-8adb-59572bab2eca" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 765.435299] env[67144]: DEBUG nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Start destroying the instance on the hypervisor. {{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 765.435491] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 765.436549] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f835fda-fba3-476b-956b-faa8217b9731 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 765.444508] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 765.444722] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bd780367-b4f9-436d-93e8-0ebc9db49d4e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 765.473912] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 765.474192] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 765.474406] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Deleting the datastore file [datastore1] 99cbc3d9-8c82-4a32-8adb-59572bab2eca {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 765.474794] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e4c99cbc-4b91-487b-910e-0f0c7627961f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 765.480392] env[67144]: DEBUG oslo_vmware.api [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Waiting for the task: (returnval){
[ 765.480392] env[67144]: value = "task-2848056"
[ 765.480392] env[67144]: _type = "Task"
[ 765.480392] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 765.488365] env[67144]: DEBUG oslo_vmware.api [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Task: {'id': task-2848056, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 765.783925] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 765.784277] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Creating directory with path [datastore1] vmware_temp/5060b5ad-d461-459c-843e-89fdda513acf/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 765.784446] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-63c9d097-ca6d-4efd-b3e4-f9ad89fc68e4 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 765.796513] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Created directory with path [datastore1] vmware_temp/5060b5ad-d461-459c-843e-89fdda513acf/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 765.797530] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Fetch image to [datastore1] vmware_temp/5060b5ad-d461-459c-843e-89fdda513acf/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 765.797530] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/5060b5ad-d461-459c-843e-89fdda513acf/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 765.797704] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df4e8498-52a6-437c-b407-0e1abb0f7953 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 765.805047] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-520206f3-93fd-4ece-94d5-3b61783f0876 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 765.813206] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12a73f0b-b177-48ed-84d6-1eff4e5700b6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 765.843851] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c514d808-9ba9-4dff-a288-9da38a9693a7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 765.849744] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0f651243-57fc-4101-9346-a699fd3fa2e5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 765.870136] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 765.916813] env[67144]: DEBUG oslo_vmware.rw_handles [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5060b5ad-d461-459c-843e-89fdda513acf/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 765.976337] env[67144]: DEBUG oslo_vmware.rw_handles [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Completed reading data from the image iterator. {{(pid=67144) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 765.976502] env[67144]: DEBUG oslo_vmware.rw_handles [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5060b5ad-d461-459c-843e-89fdda513acf/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 765.990181] env[67144]: DEBUG oslo_vmware.api [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Task: {'id': task-2848056, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.041799} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 765.990410] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 765.990586] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 765.990750] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 765.990916] env[67144]: INFO nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Took 0.56 seconds to destroy the instance on the hypervisor.
[ 765.991157] env[67144]: DEBUG oslo.service.loopingcall [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 765.991348] env[67144]: DEBUG nova.compute.manager [-] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Skipping network deallocation for instance since networking was not requested. {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
[ 765.993396] env[67144]: DEBUG nova.compute.claims [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 765.993567] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 765.993771] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 766.266977] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f19401db-44fd-4144-808b-563dacc9fede {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.274133] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e9ef70d-e561-4722-9254-d39a9622a093 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.302980] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8f699fd-1a26-405c-a08c-a282943df8c9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.309614] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d591039c-5ed9-40b2-a381-d221e67eb059 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.321972] env[67144]: DEBUG nova.compute.provider_tree [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 766.330836] env[67144]: DEBUG nova.scheduler.client.report [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 766.343816] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.350s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 766.347042] env[67144]: ERROR nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 766.347042] env[67144]: Faults: ['InvalidArgument']
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Traceback (most recent call last):
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     self.driver.spawn(context, instance, image_meta,
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     self._fetch_image_if_missing(context, vi)
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     image_cache(vi, tmp_image_ds_loc)
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     vm_util.copy_virtual_disk(
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     session._wait_for_task(vmdk_copy_task)
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     return self.wait_for_task(task_ref)
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     return evt.wait()
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     result = hub.switch()
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     return self.greenlet.switch()
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     self.f(*self.args, **self.kw)
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]     raise exceptions.translate_fault(task_info.error)
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Faults: ['InvalidArgument']
[ 766.347042] env[67144]: ERROR nova.compute.manager [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca]
[ 766.347042] env[67144]: DEBUG nova.compute.utils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] VimFaultException {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 766.347042] env[67144]: DEBUG nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Build of instance 99cbc3d9-8c82-4a32-8adb-59572bab2eca was re-scheduled: A specified parameter was not correct: fileType
[ 766.347042] env[67144]: Faults: ['InvalidArgument'] {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 766.347042] env[67144]: DEBUG nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 766.348025] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquiring lock "refresh_cache-99cbc3d9-8c82-4a32-8adb-59572bab2eca" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 766.348025] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Acquired lock "refresh_cache-99cbc3d9-8c82-4a32-8adb-59572bab2eca" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 766.348025] env[67144]: DEBUG nova.network.neutron [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 766.382074] env[67144]: DEBUG nova.network.neutron [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Instance cache missing network info. {{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 766.482711] env[67144]: DEBUG nova.network.neutron [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 766.492610] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Releasing lock "refresh_cache-99cbc3d9-8c82-4a32-8adb-59572bab2eca" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 766.492832] env[67144]: DEBUG nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 766.495282] env[67144]: DEBUG nova.compute.manager [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] [instance: 99cbc3d9-8c82-4a32-8adb-59572bab2eca] Skipping network deallocation for instance since networking was not requested. {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
[ 766.577662] env[67144]: INFO nova.scheduler.client.report [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Deleted allocations for instance 99cbc3d9-8c82-4a32-8adb-59572bab2eca
[ 766.594807] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a349202e-c873-4977-a6b9-3a16583d3d77 tempest-ServersAdmin275Test-2004066370 tempest-ServersAdmin275Test-2004066370-project-member] Lock "99cbc3d9-8c82-4a32-8adb-59572bab2eca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 149.566s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 766.611389] env[67144]: DEBUG nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Starting instance...
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 766.659561] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 766.659812] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 766.661261] env[67144]: INFO nova.compute.claims [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 766.948389] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2f3dc02-b1c8-4a7e-84ae-fa4b1ef94278 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 766.956306] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53c42c48-8c73-49a0-afaf-16d61658912f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 766.985813] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4f9af9e-b222-4e56-a160-c86850e4562e {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 766.992846] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b42e4c52-889f-497f-ada8-85af4a747923 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.007470] env[67144]: DEBUG nova.compute.provider_tree [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 767.016173] env[67144]: DEBUG nova.scheduler.client.report [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 767.030020] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.369s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 767.030020] env[67144]: DEBUG nova.compute.manager [None 
req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 767.060462] env[67144]: DEBUG nova.compute.utils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 767.061821] env[67144]: DEBUG nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 767.061995] env[67144]: DEBUG nova.network.neutron [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 767.069702] env[67144]: DEBUG nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Start building block device mappings for instance. 
{{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 767.115416] env[67144]: DEBUG nova.policy [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2941a3a277b949bf837e7887b455417a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'af67cc35c9164a509559178d0af9a42a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 767.128135] env[67144]: DEBUG nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Start spawning the instance on the hypervisor. 
{{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 767.148764] env[67144]: DEBUG nova.virt.hardware [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 767.149009] env[67144]: DEBUG nova.virt.hardware [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 767.149188] env[67144]: DEBUG nova.virt.hardware [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 767.149380] env[67144]: DEBUG nova.virt.hardware [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 
tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 767.149524] env[67144]: DEBUG nova.virt.hardware [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 767.149667] env[67144]: DEBUG nova.virt.hardware [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 767.149868] env[67144]: DEBUG nova.virt.hardware [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 767.150032] env[67144]: DEBUG nova.virt.hardware [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 767.150203] env[67144]: DEBUG nova.virt.hardware [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 767.150361] 
env[67144]: DEBUG nova.virt.hardware [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 767.150524] env[67144]: DEBUG nova.virt.hardware [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 767.151407] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38939eb4-764a-4aa9-92e3-a35870b44bdd {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.159418] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c101dfb-1d52-4cf6-af54-63a6bc148ede {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.693614] env[67144]: DEBUG nova.network.neutron [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Successfully created port: 7b56e275-2ddc-4bd6-b538-86c7cc0571c7 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 768.836747] env[67144]: DEBUG nova.compute.manager [req-0ffe3b70-cfcc-4082-9efb-9140b951384c req-08fa1e6b-f7a2-4605-a078-16383b1f6979 service nova] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Received event network-vif-plugged-7b56e275-2ddc-4bd6-b538-86c7cc0571c7 {{(pid=67144) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11004}} [ 768.836983] env[67144]: DEBUG oslo_concurrency.lockutils [req-0ffe3b70-cfcc-4082-9efb-9140b951384c req-08fa1e6b-f7a2-4605-a078-16383b1f6979 service nova] Acquiring lock "0811722e-2ae9-4018-a85d-ab4fe5f46370-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 768.837159] env[67144]: DEBUG oslo_concurrency.lockutils [req-0ffe3b70-cfcc-4082-9efb-9140b951384c req-08fa1e6b-f7a2-4605-a078-16383b1f6979 service nova] Lock "0811722e-2ae9-4018-a85d-ab4fe5f46370-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 768.837329] env[67144]: DEBUG oslo_concurrency.lockutils [req-0ffe3b70-cfcc-4082-9efb-9140b951384c req-08fa1e6b-f7a2-4605-a078-16383b1f6979 service nova] Lock "0811722e-2ae9-4018-a85d-ab4fe5f46370-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 768.837506] env[67144]: DEBUG nova.compute.manager [req-0ffe3b70-cfcc-4082-9efb-9140b951384c req-08fa1e6b-f7a2-4605-a078-16383b1f6979 service nova] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] No waiting events found dispatching network-vif-plugged-7b56e275-2ddc-4bd6-b538-86c7cc0571c7 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 768.837697] env[67144]: WARNING nova.compute.manager [req-0ffe3b70-cfcc-4082-9efb-9140b951384c req-08fa1e6b-f7a2-4605-a078-16383b1f6979 service nova] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Received unexpected event network-vif-plugged-7b56e275-2ddc-4bd6-b538-86c7cc0571c7 for instance with vm_state building and task_state spawning. 
[ 768.938297] env[67144]: DEBUG nova.network.neutron [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Successfully updated port: 7b56e275-2ddc-4bd6-b538-86c7cc0571c7 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 768.945477] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Acquiring lock "refresh_cache-0811722e-2ae9-4018-a85d-ab4fe5f46370" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 768.945637] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Acquired lock "refresh_cache-0811722e-2ae9-4018-a85d-ab4fe5f46370" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 768.945790] env[67144]: DEBUG nova.network.neutron [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 769.012687] env[67144]: DEBUG nova.network.neutron [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Instance cache missing network info. 
{{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 769.334911] env[67144]: DEBUG nova.network.neutron [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Updating instance_info_cache with network_info: [{"id": "7b56e275-2ddc-4bd6-b538-86c7cc0571c7", "address": "fa:16:3e:6e:d5:72", "network": {"id": "7df45e3c-21c8-417f-8c4a-12f22e240c49", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-179949099-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "af67cc35c9164a509559178d0af9a42a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "193994c7-8e1b-4f25-a4a4-d0563845eb28", "external-id": "nsx-vlan-transportzone-607", "segmentation_id": 607, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b56e275-2d", "ovs_interfaceid": "7b56e275-2ddc-4bd6-b538-86c7cc0571c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 769.355599] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Releasing lock "refresh_cache-0811722e-2ae9-4018-a85d-ab4fe5f46370" {{(pid=67144) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 769.356972] env[67144]: DEBUG nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Instance network_info: |[{"id": "7b56e275-2ddc-4bd6-b538-86c7cc0571c7", "address": "fa:16:3e:6e:d5:72", "network": {"id": "7df45e3c-21c8-417f-8c4a-12f22e240c49", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-179949099-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "af67cc35c9164a509559178d0af9a42a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "193994c7-8e1b-4f25-a4a4-d0563845eb28", "external-id": "nsx-vlan-transportzone-607", "segmentation_id": 607, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b56e275-2d", "ovs_interfaceid": "7b56e275-2ddc-4bd6-b538-86c7cc0571c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 769.357610] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6e:d5:72', 
'network_ref': {'type': 'OpaqueNetwork', 'network-id': '193994c7-8e1b-4f25-a4a4-d0563845eb28', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7b56e275-2ddc-4bd6-b538-86c7cc0571c7', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 769.365789] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Creating folder: Project (af67cc35c9164a509559178d0af9a42a). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 769.366666] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-92292499-7f3a-41f0-8c4c-d81394703806 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.377909] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Created folder: Project (af67cc35c9164a509559178d0af9a42a) in parent group-v572613. [ 769.378207] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Creating folder: Instances. Parent ref: group-v572658. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 769.378455] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f1b4a7c9-c434-4aa1-b89e-a9a43f0741da {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.389196] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Created folder: Instances in parent group-v572658. [ 769.389443] env[67144]: DEBUG oslo.service.loopingcall [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 769.389626] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 769.389817] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6a18e5c8-44d1-48ca-bd7a-3f3fe7a2700b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.411918] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 769.411918] env[67144]: value = "task-2848059" [ 769.411918] env[67144]: _type = "Task" [ 769.411918] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 769.418817] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848059, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 769.925560] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848059, 'name': CreateVM_Task, 'duration_secs': 0.299829} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 769.925783] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 769.930449] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 769.930629] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 769.931012] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 769.931224] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cf55d354-d6ba-48ce-9060-888a2089e18c {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.936184] env[67144]: DEBUG oslo_vmware.api [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Waiting for the task: (returnval){ [ 769.936184] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]528df977-a3ee-ee6d-5dc4-b468a2f89e36" [ 769.936184] env[67144]: _type = "Task" [ 769.936184] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 769.948357] env[67144]: DEBUG oslo_vmware.api [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]528df977-a3ee-ee6d-5dc4-b468a2f89e36, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 770.446011] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 770.446011] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 770.446153] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f 
tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 770.872129] env[67144]: DEBUG nova.compute.manager [req-055a7f4d-359d-4313-868f-3cdfe862aa5b req-f2f805d4-26ea-4c5f-b007-b49bdb085f66 service nova] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Received event network-changed-7b56e275-2ddc-4bd6-b538-86c7cc0571c7 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 770.873660] env[67144]: DEBUG nova.compute.manager [req-055a7f4d-359d-4313-868f-3cdfe862aa5b req-f2f805d4-26ea-4c5f-b007-b49bdb085f66 service nova] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Refreshing instance network info cache due to event network-changed-7b56e275-2ddc-4bd6-b538-86c7cc0571c7. 
{{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 770.873660] env[67144]: DEBUG oslo_concurrency.lockutils [req-055a7f4d-359d-4313-868f-3cdfe862aa5b req-f2f805d4-26ea-4c5f-b007-b49bdb085f66 service nova] Acquiring lock "refresh_cache-0811722e-2ae9-4018-a85d-ab4fe5f46370" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 770.873660] env[67144]: DEBUG oslo_concurrency.lockutils [req-055a7f4d-359d-4313-868f-3cdfe862aa5b req-f2f805d4-26ea-4c5f-b007-b49bdb085f66 service nova] Acquired lock "refresh_cache-0811722e-2ae9-4018-a85d-ab4fe5f46370" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 770.873660] env[67144]: DEBUG nova.network.neutron [req-055a7f4d-359d-4313-868f-3cdfe862aa5b req-f2f805d4-26ea-4c5f-b007-b49bdb085f66 service nova] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Refreshing network info cache for port 7b56e275-2ddc-4bd6-b538-86c7cc0571c7 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 771.393116] env[67144]: DEBUG nova.network.neutron [req-055a7f4d-359d-4313-868f-3cdfe862aa5b req-f2f805d4-26ea-4c5f-b007-b49bdb085f66 service nova] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Updated VIF entry in instance network info cache for port 7b56e275-2ddc-4bd6-b538-86c7cc0571c7. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 771.393459] env[67144]: DEBUG nova.network.neutron [req-055a7f4d-359d-4313-868f-3cdfe862aa5b req-f2f805d4-26ea-4c5f-b007-b49bdb085f66 service nova] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Updating instance_info_cache with network_info: [{"id": "7b56e275-2ddc-4bd6-b538-86c7cc0571c7", "address": "fa:16:3e:6e:d5:72", "network": {"id": "7df45e3c-21c8-417f-8c4a-12f22e240c49", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-179949099-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "af67cc35c9164a509559178d0af9a42a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "193994c7-8e1b-4f25-a4a4-d0563845eb28", "external-id": "nsx-vlan-transportzone-607", "segmentation_id": 607, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b56e275-2d", "ovs_interfaceid": "7b56e275-2ddc-4bd6-b538-86c7cc0571c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 771.404234] env[67144]: DEBUG oslo_concurrency.lockutils [req-055a7f4d-359d-4313-868f-3cdfe862aa5b req-f2f805d4-26ea-4c5f-b007-b49bdb085f66 service nova] Releasing lock "refresh_cache-0811722e-2ae9-4018-a85d-ab4fe5f46370" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 811.416538] env[67144]: DEBUG oslo_service.periodic_task 
[None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 812.416359] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 812.416546] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Starting heal instance info cache {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 812.416781] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Rebuilding the list of instances to heal {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 812.436381] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.436542] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.436675] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Skipping network cache update for instance because it is Building. 
{{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.436804] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.436967] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.437144] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.437274] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.437396] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.437516] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Skipping network cache update for instance because it is Building. 
{{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.437635] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.437810] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Didn't find any instances for network info cache update. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 812.438604] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 812.438783] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 812.438940] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 812.452640] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 812.452869] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 812.453046] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 812.453209] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67144) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 812.454248] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bd7e4cf-2f19-4c20-8c7c-99546f991e67 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.462828] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af003974-223e-4da6-9c0d-8b71473f8c9a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.476670] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a3bff76-168a-486f-8994-f213491b116c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.482944] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7a63d4b-7786-4771-b5cb-7e62ae5c0d5f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.513054] env[67144]: DEBUG nova.compute.resource_tracker [None 
req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181029MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=67144) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 812.513226] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 812.513437] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 812.581458] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b04052f8-b29f-4b32-b249-02b83d3d77f9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 812.581616] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 6cbf4358-dcfa-471b-ae1a-e6a512c47d26 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 812.581743] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance c2d5335a-4332-4828-855d-380cdea64a1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 812.581866] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 5bb4c082-f5fc-42e6-891a-4866eef1add6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 812.581985] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 812.582123] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance ca7b7941-c016-4968-9beb-f8c094ca16cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 812.582244] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance f61f525f-70a5-402f-bf52-0bd4041b907f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 812.582361] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance d4eaa8fd-84b5-47a2-832a-9106187bc531 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 812.582476] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b1bba9da-84f7-4d67-8ad6-af7cb429dd9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 812.582591] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 0811722e-2ae9-4018-a85d-ab4fe5f46370 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 812.594399] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 42ce3afe-e725-4688-b048-bd6721c22c35 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.606423] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 48037468-8c60-4449-8297-46eadab5246e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.617500] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance c3621484-8333-4375-9700-62b08d90887f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.629416] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance eebe36ea-6a07-4806-bade-4222dcf24247 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.641539] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 3a37ecb3-0196-4230-adea-ed14355ece08 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.652256] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance abda0de6-f344-4dd1-b439-42826b59de5a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.666751] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b932a680-76a5-4f08-ac38-2fc1578b4a86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.679174] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 56ba6c8d-1717-4d07-b547-7872f985b0f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.690404] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 41193ca9-3f5f-43a2-9335-1010b1f752a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.702890] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance e64bc93e-f99f-4f9e-a41e-283d405b1b92 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.714391] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 07259d91-ca24-4e5e-8340-d72f3b8e2776 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.714648] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 812.714793] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 813.037090] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4bfd329-eb01-4f72-b790-22bb4a00129d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.045657] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b1dfd61-c3a9-49c7-8d85-61861ba7309e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.078943] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edbf8618-576a-4c61-aab8-f602daf1a3f6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.087175] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-463a15ab-52bd-45b5-b879-c6c8e837b9c7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.102227] env[67144]: DEBUG nova.compute.provider_tree [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed in ProviderTree for provider: 
0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 813.111277] env[67144]: DEBUG nova.scheduler.client.report [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 813.126695] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67144) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 813.127118] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.613s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 813.835037] env[67144]: DEBUG nova.compute.manager [req-ec5dc591-f502-4059-81b1-b21d885e86c5 req-0a3818b8-0ad9-4f13-8ab7-e525a27f32dd service nova] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Received event network-vif-deleted-123b0146-c529-4dd6-800b-2e7bbbcb716b {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 814.105284] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67144) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 814.105353] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 814.127049] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 814.127049] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67144) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 814.417148] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 814.744012] env[67144]: WARNING oslo_vmware.rw_handles [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 814.744012] env[67144]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 814.744012] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 814.744012] env[67144]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 814.744012] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 814.744012] 
env[67144]: ERROR oslo_vmware.rw_handles response.begin() [ 814.744012] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 814.744012] env[67144]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 814.744012] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 814.744012] env[67144]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 814.744012] env[67144]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 814.744012] env[67144]: ERROR oslo_vmware.rw_handles [ 814.744458] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Downloaded image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to vmware_temp/5060b5ad-d461-459c-843e-89fdda513acf/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 814.746087] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Caching image {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 814.746328] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Copying Virtual Disk [datastore1] vmware_temp/5060b5ad-d461-459c-843e-89fdda513acf/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk to [datastore1] 
vmware_temp/5060b5ad-d461-459c-843e-89fdda513acf/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk {{(pid=67144) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 814.746606] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6540a6f7-61df-4575-900a-66126da0ee52 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 814.754599] env[67144]: DEBUG oslo_vmware.api [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Waiting for the task: (returnval){ [ 814.754599] env[67144]: value = "task-2848060" [ 814.754599] env[67144]: _type = "Task" [ 814.754599] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 814.762910] env[67144]: DEBUG oslo_vmware.api [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Task: {'id': task-2848060, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 815.267982] env[67144]: DEBUG oslo_vmware.exceptions [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Fault InvalidArgument not matched. 
{{(pid=67144) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 815.267982] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 815.268561] env[67144]: ERROR nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 815.268561] env[67144]: Faults: ['InvalidArgument'] [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Traceback (most recent call last): [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] yield resources [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] self.driver.spawn(context, instance, image_meta, [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: 
b04052f8-b29f-4b32-b249-02b83d3d77f9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] self._fetch_image_if_missing(context, vi) [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] image_cache(vi, tmp_image_ds_loc) [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] vm_util.copy_virtual_disk( [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] session._wait_for_task(vmdk_copy_task) [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] return self.wait_for_task(task_ref) [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: 
b04052f8-b29f-4b32-b249-02b83d3d77f9] return evt.wait() [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] result = hub.switch() [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] return self.greenlet.switch() [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] self.f(*self.args, **self.kw) [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] raise exceptions.translate_fault(task_info.error) [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Faults: ['InvalidArgument'] [ 815.268561] env[67144]: ERROR nova.compute.manager [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] [ 815.272204] env[67144]: INFO nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] 
[instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Terminating instance [ 815.272425] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 815.273059] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 815.273651] env[67144]: DEBUG nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Start destroying the instance on the hypervisor. 
{{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 815.275886] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 815.275886] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-191f58e6-c473-4eeb-b3fc-ddd04fb1ba5a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 815.278027] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-593edb75-41d5-42d1-b018-d89516183e4c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 815.286022] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 815.286581] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-34f5c4ac-9dff-44bf-a133-57cf2c0311a8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 815.288104] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 815.288289] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None 
req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 815.288949] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f8c3316f-c73b-46ca-b5c5-54dfc047ad90 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 815.294109] env[67144]: DEBUG oslo_vmware.api [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Waiting for the task: (returnval){ [ 815.294109] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5247b713-f42e-91cf-a896-e67794fc0cd9" [ 815.294109] env[67144]: _type = "Task" [ 815.294109] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 815.301801] env[67144]: DEBUG oslo_vmware.api [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5247b713-f42e-91cf-a896-e67794fc0cd9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 815.359329] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 815.359709] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 815.359946] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Deleting the datastore file [datastore1] b04052f8-b29f-4b32-b249-02b83d3d77f9 {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 815.360264] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-164f5c39-5b41-4ebd-97b6-127722294d16 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 815.366826] env[67144]: DEBUG oslo_vmware.api [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Waiting for the task: (returnval){ [ 815.366826] env[67144]: value = "task-2848062" [ 815.366826] env[67144]: _type = "Task" [ 815.366826] env[67144]: } to complete. 
{{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 815.374762] env[67144]: DEBUG oslo_vmware.api [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Task: {'id': task-2848062, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 815.416475] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 815.805900] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 815.806379] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Creating directory with path [datastore1] vmware_temp/6dca3b0f-b061-4e95-a3ea-451768ef2a07/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 815.806460] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-43e6cd42-bf0c-4d35-a971-956e1161bd62 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 815.818519] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 
tempest-DeleteServersAdminTestJSON-1141988955-project-member] Created directory with path [datastore1] vmware_temp/6dca3b0f-b061-4e95-a3ea-451768ef2a07/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 815.818728] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Fetch image to [datastore1] vmware_temp/6dca3b0f-b061-4e95-a3ea-451768ef2a07/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 815.818897] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/6dca3b0f-b061-4e95-a3ea-451768ef2a07/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 815.820463] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-344062b8-137d-47ff-838e-ff7c44e8f9d0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 815.828874] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9667359-7261-44e9-873b-a89f4f314d88 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 815.839036] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-683eae45-1455-45b6-9c88-2d704f7b1937 
{{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 815.881239] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93786ad9-bcf4-409f-b667-5a524fc85348 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 815.889036] env[67144]: DEBUG oslo_vmware.api [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Task: {'id': task-2848062, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076558} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 815.889294] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-17d12a4c-f802-45fc-83ec-50a9b5d42e43 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 815.891166] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 815.891363] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 815.891541] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Instance 
destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 815.891713] env[67144]: INFO nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Took 0.62 seconds to destroy the instance on the hypervisor. [ 815.893770] env[67144]: DEBUG nova.compute.claims [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 815.893911] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 815.894423] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 815.929026] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 
815.929026] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.035s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 815.929759] env[67144]: DEBUG nova.compute.utils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Instance b04052f8-b29f-4b32-b249-02b83d3d77f9 could not be found. {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 815.931295] env[67144]: DEBUG nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Instance disappeared during build. {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 815.931460] env[67144]: DEBUG nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 815.931617] env[67144]: DEBUG nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 815.931779] env[67144]: DEBUG nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 815.931933] env[67144]: DEBUG nova.network.neutron [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 815.962382] env[67144]: DEBUG nova.network.neutron [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 815.975386] env[67144]: INFO nova.compute.manager [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Took 0.04 seconds to deallocate network for instance. [ 815.984143] env[67144]: DEBUG oslo_vmware.rw_handles [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6dca3b0f-b061-4e95-a3ea-451768ef2a07/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67144) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 816.044158] env[67144]: DEBUG oslo_vmware.rw_handles [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Completed reading data from the image iterator. {{(pid=67144) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 816.044158] env[67144]: DEBUG oslo_vmware.rw_handles [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6dca3b0f-b061-4e95-a3ea-451768ef2a07/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 816.060775] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f01566c2-7f87-4e8a-b8bc-069eab247223 tempest-ServersTestFqdnHostnames-803617242 tempest-ServersTestFqdnHostnames-803617242-project-member] Lock "b04052f8-b29f-4b32-b249-02b83d3d77f9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.513s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 816.073298] env[67144]: DEBUG nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Starting instance... 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 816.150741] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 816.151020] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 816.152741] env[67144]: INFO nova.compute.claims [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 816.482549] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e366324-a381-4868-ae7f-f80c29aafbec {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.491365] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf074978-e1d3-4a00-a8d6-0383751a884e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.522236] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abde7736-cac5-4365-8844-4dcaba0e310d {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.529393] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e21fda2d-13b3-4461-8ab8-5741783143a6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.542291] env[67144]: DEBUG nova.compute.provider_tree [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 816.550892] env[67144]: DEBUG nova.scheduler.client.report [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 816.564385] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.413s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 816.564855] env[67144]: DEBUG nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 
tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 816.602675] env[67144]: DEBUG nova.compute.utils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 816.603902] env[67144]: DEBUG nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 816.604088] env[67144]: DEBUG nova.network.neutron [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 816.619718] env[67144]: DEBUG nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Start building block device mappings for instance. 
{{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 816.693430] env[67144]: DEBUG nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Start spawning the instance on the hypervisor. {{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 816.717764] env[67144]: DEBUG nova.virt.hardware [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 816.717952] env[67144]: DEBUG nova.virt.hardware [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 816.718130] env[67144]: DEBUG nova.virt.hardware [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 
tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 816.718310] env[67144]: DEBUG nova.virt.hardware [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 816.718453] env[67144]: DEBUG nova.virt.hardware [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 816.718599] env[67144]: DEBUG nova.virt.hardware [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 816.719062] env[67144]: DEBUG nova.virt.hardware [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 816.719375] env[67144]: DEBUG nova.virt.hardware [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 816.719577] env[67144]: DEBUG nova.virt.hardware 
[None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 816.719784] env[67144]: DEBUG nova.virt.hardware [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 816.719997] env[67144]: DEBUG nova.virt.hardware [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 816.720878] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4af5b3b-4b26-430f-a866-85e634e1c4b9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.725205] env[67144]: DEBUG nova.policy [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '091e7893bcfa442c83b01895ef4058ec', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '724052bc544a4c22a63326ac6a11b0a3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 
816.736814] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f5165b8-8302-4d09-af99-4d927e6a0c40 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.754529] env[67144]: DEBUG nova.compute.manager [req-621465f4-dc69-45f7-a9e7-6fb277cb394a req-4de8d950-7ed2-4c80-a9ef-376028056ad5 service nova] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Received event network-vif-deleted-a7100a4b-e3a9-4a93-91f1-054a28c8a5f5 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 817.347932] env[67144]: DEBUG nova.network.neutron [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Successfully created port: a2e066a2-ec0b-4533-92d3-97cddba99b24 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 818.592479] env[67144]: DEBUG nova.network.neutron [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Successfully updated port: a2e066a2-ec0b-4533-92d3-97cddba99b24 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 818.603009] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Acquiring lock "refresh_cache-42ce3afe-e725-4688-b048-bd6721c22c35" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 818.603165] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Acquired 
lock "refresh_cache-42ce3afe-e725-4688-b048-bd6721c22c35" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 818.603314] env[67144]: DEBUG nova.network.neutron [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 818.661246] env[67144]: DEBUG nova.network.neutron [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Instance cache missing network info. {{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 818.953550] env[67144]: DEBUG nova.network.neutron [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Updating instance_info_cache with network_info: [{"id": "a2e066a2-ec0b-4533-92d3-97cddba99b24", "address": "fa:16:3e:e0:47:1d", "network": {"id": "6b7fc84a-2aab-46aa-8ec5-b427d5f491a5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-775593760-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "724052bc544a4c22a63326ac6a11b0a3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"f17856cf-7248-414b-bde6-8c90cfb4c593", "external-id": "nsx-vlan-transportzone-341", "segmentation_id": 341, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2e066a2-ec", "ovs_interfaceid": "a2e066a2-ec0b-4533-92d3-97cddba99b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 818.966921] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Releasing lock "refresh_cache-42ce3afe-e725-4688-b048-bd6721c22c35" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 818.966921] env[67144]: DEBUG nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Instance network_info: |[{"id": "a2e066a2-ec0b-4533-92d3-97cddba99b24", "address": "fa:16:3e:e0:47:1d", "network": {"id": "6b7fc84a-2aab-46aa-8ec5-b427d5f491a5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-775593760-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "724052bc544a4c22a63326ac6a11b0a3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f17856cf-7248-414b-bde6-8c90cfb4c593", "external-id": 
"nsx-vlan-transportzone-341", "segmentation_id": 341, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2e066a2-ec", "ovs_interfaceid": "a2e066a2-ec0b-4533-92d3-97cddba99b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 818.968577] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e0:47:1d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f17856cf-7248-414b-bde6-8c90cfb4c593', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a2e066a2-ec0b-4533-92d3-97cddba99b24', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 818.979015] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Creating folder: Project (724052bc544a4c22a63326ac6a11b0a3). Parent ref: group-v572613. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 818.979667] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a89cc7a3-6fbc-4be8-a50b-d269bc9a8e1b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.991879] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Created folder: Project (724052bc544a4c22a63326ac6a11b0a3) in parent group-v572613. [ 818.992119] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Creating folder: Instances. Parent ref: group-v572661. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 818.992358] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0b10f73a-74f4-4be9-a34e-0f1316bfa535 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 819.002221] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Created folder: Instances in parent group-v572661. [ 819.002321] env[67144]: DEBUG oslo.service.loopingcall [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 819.002520] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 819.002683] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fd5474e8-e12a-4703-ab5e-e04a7452e972 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 819.022053] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 819.022053] env[67144]: value = "task-2848065" [ 819.022053] env[67144]: _type = "Task" [ 819.022053] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 819.030448] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848065, 'name': CreateVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 819.221784] env[67144]: DEBUG nova.compute.manager [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Received event network-vif-plugged-a2e066a2-ec0b-4533-92d3-97cddba99b24 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 819.222104] env[67144]: DEBUG oslo_concurrency.lockutils [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] Acquiring lock "42ce3afe-e725-4688-b048-bd6721c22c35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 819.222383] env[67144]: DEBUG oslo_concurrency.lockutils [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] Lock 
"42ce3afe-e725-4688-b048-bd6721c22c35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 819.222470] env[67144]: DEBUG oslo_concurrency.lockutils [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] Lock "42ce3afe-e725-4688-b048-bd6721c22c35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 819.222662] env[67144]: DEBUG nova.compute.manager [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] No waiting events found dispatching network-vif-plugged-a2e066a2-ec0b-4533-92d3-97cddba99b24 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 819.222778] env[67144]: WARNING nova.compute.manager [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Received unexpected event network-vif-plugged-a2e066a2-ec0b-4533-92d3-97cddba99b24 for instance with vm_state building and task_state spawning. 
[ 819.222935] env[67144]: DEBUG nova.compute.manager [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Received event network-changed-a2e066a2-ec0b-4533-92d3-97cddba99b24 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 819.223105] env[67144]: DEBUG nova.compute.manager [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Refreshing instance network info cache due to event network-changed-a2e066a2-ec0b-4533-92d3-97cddba99b24. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 819.223284] env[67144]: DEBUG oslo_concurrency.lockutils [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] Acquiring lock "refresh_cache-42ce3afe-e725-4688-b048-bd6721c22c35" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 819.223417] env[67144]: DEBUG oslo_concurrency.lockutils [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] Acquired lock "refresh_cache-42ce3afe-e725-4688-b048-bd6721c22c35" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 819.223566] env[67144]: DEBUG nova.network.neutron [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Refreshing network info cache for port a2e066a2-ec0b-4533-92d3-97cddba99b24 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 819.531311] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848065, 'name': CreateVM_Task, 'duration_secs': 0.291329} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 819.531499] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 819.532132] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 819.532477] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 819.532591] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 819.532877] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4d8c2f4e-cd43-412c-9454-279c181e358e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 819.537169] env[67144]: DEBUG oslo_vmware.api [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Waiting for the task: 
(returnval){ [ 819.537169] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52603579-64e5-78e2-6281-008aab66d8ec" [ 819.537169] env[67144]: _type = "Task" [ 819.537169] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 819.544707] env[67144]: DEBUG oslo_vmware.api [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52603579-64e5-78e2-6281-008aab66d8ec, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 819.871963] env[67144]: DEBUG nova.network.neutron [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Updated VIF entry in instance network info cache for port a2e066a2-ec0b-4533-92d3-97cddba99b24. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 819.872337] env[67144]: DEBUG nova.network.neutron [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Updating instance_info_cache with network_info: [{"id": "a2e066a2-ec0b-4533-92d3-97cddba99b24", "address": "fa:16:3e:e0:47:1d", "network": {"id": "6b7fc84a-2aab-46aa-8ec5-b427d5f491a5", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-775593760-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "724052bc544a4c22a63326ac6a11b0a3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f17856cf-7248-414b-bde6-8c90cfb4c593", "external-id": "nsx-vlan-transportzone-341", "segmentation_id": 341, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2e066a2-ec", "ovs_interfaceid": "a2e066a2-ec0b-4533-92d3-97cddba99b24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 819.881728] env[67144]: DEBUG oslo_concurrency.lockutils [req-f32f9113-85c5-425c-9936-92388b2d4bc0 req-5707e263-bc91-476c-9272-b316d0a28e3b service nova] Releasing lock "refresh_cache-42ce3afe-e725-4688-b048-bd6721c22c35" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 820.047194] env[67144]: DEBUG oslo_concurrency.lockutils [None 
req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 820.047477] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 820.047699] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 820.499247] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "670f3974-b332-48c2-9aab-6a9ed01731b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 820.499650] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "670f3974-b332-48c2-9aab-6a9ed01731b7" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 820.817215] env[67144]: DEBUG oslo_concurrency.lockutils [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquiring lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 821.304423] env[67144]: DEBUG nova.compute.manager [req-148c7cc8-cd9c-4dfd-923d-9601fd6fb3c9 req-d52e4e3b-fd06-4406-9b83-93339e795d82 service nova] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Received event network-vif-deleted-618495ef-fa31-4a5f-bc87-1e975278e852 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 821.304679] env[67144]: DEBUG nova.compute.manager [req-148c7cc8-cd9c-4dfd-923d-9601fd6fb3c9 req-d52e4e3b-fd06-4406-9b83-93339e795d82 service nova] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Received event network-vif-deleted-062893e1-cc24-4478-a285-0cabddeb2f43 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 824.316459] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Acquiring lock "3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 824.316861] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 
tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Lock "3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 826.825348] env[67144]: DEBUG oslo_concurrency.lockutils [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquiring lock "ca7b7941-c016-4968-9beb-f8c094ca16cd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 827.597661] env[67144]: DEBUG oslo_concurrency.lockutils [None req-6209bf46-9f50-4da5-acbd-db94347a1282 tempest-ServersTestJSON-1067565229 tempest-ServersTestJSON-1067565229-project-member] Acquiring lock "e32c24e1-485d-48b9-827b-fceb6828510c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 827.598023] env[67144]: DEBUG oslo_concurrency.lockutils [None req-6209bf46-9f50-4da5-acbd-db94347a1282 tempest-ServersTestJSON-1067565229 tempest-ServersTestJSON-1067565229-project-member] Lock "e32c24e1-485d-48b9-827b-fceb6828510c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.161080] env[67144]: WARNING oslo_vmware.rw_handles [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection 
without response [ 863.161080] env[67144]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 863.161080] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 863.161080] env[67144]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 863.161080] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 863.161080] env[67144]: ERROR oslo_vmware.rw_handles response.begin() [ 863.161080] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 863.161080] env[67144]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 863.161080] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 863.161080] env[67144]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 863.161080] env[67144]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 863.161080] env[67144]: ERROR oslo_vmware.rw_handles [ 863.161816] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Downloaded image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to vmware_temp/6dca3b0f-b061-4e95-a3ea-451768ef2a07/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 863.163407] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Caching image 
{{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 863.163661] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Copying Virtual Disk [datastore1] vmware_temp/6dca3b0f-b061-4e95-a3ea-451768ef2a07/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk to [datastore1] vmware_temp/6dca3b0f-b061-4e95-a3ea-451768ef2a07/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk {{(pid=67144) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 863.163946] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e7775e99-73a2-41ae-b06d-5851bc71bcf2 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.172639] env[67144]: DEBUG oslo_vmware.api [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Waiting for the task: (returnval){ [ 863.172639] env[67144]: value = "task-2848066" [ 863.172639] env[67144]: _type = "Task" [ 863.172639] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 863.181652] env[67144]: DEBUG oslo_vmware.api [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Task: {'id': task-2848066, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 863.682657] env[67144]: DEBUG oslo_vmware.exceptions [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Fault InvalidArgument not matched. {{(pid=67144) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 863.682874] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 863.683456] env[67144]: ERROR nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 863.683456] env[67144]: Faults: ['InvalidArgument'] [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Traceback (most recent call last): [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] yield resources [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: 
c2d5335a-4332-4828-855d-380cdea64a1a] self.driver.spawn(context, instance, image_meta, [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] self._fetch_image_if_missing(context, vi) [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] image_cache(vi, tmp_image_ds_loc) [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] vm_util.copy_virtual_disk( [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] session._wait_for_task(vmdk_copy_task) [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: 
c2d5335a-4332-4828-855d-380cdea64a1a] return self.wait_for_task(task_ref) [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] return evt.wait() [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] result = hub.switch() [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] return self.greenlet.switch() [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] self.f(*self.args, **self.kw) [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] raise exceptions.translate_fault(task_info.error) [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] 
Faults: ['InvalidArgument'] [ 863.683456] env[67144]: ERROR nova.compute.manager [instance: c2d5335a-4332-4828-855d-380cdea64a1a] [ 863.684546] env[67144]: INFO nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Terminating instance [ 863.685320] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 863.685549] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 863.686219] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6c018e48-99bc-4a4f-83d7-6074636fdf11 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.688128] env[67144]: DEBUG nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Start destroying the instance on the hypervisor. 
{{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 863.688359] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 863.689068] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19b7c8b8-968d-4634-8f36-a993f68adebf {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.695940] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 863.696172] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dfc150f9-171f-4aed-90f7-b6985bde2318 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.698429] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 863.698621] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 863.699549] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-18c26f66-336e-47c4-8f03-bba1d1fcd232 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.704697] env[67144]: DEBUG oslo_vmware.api [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Waiting for the task: (returnval){ [ 863.704697] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]520cd83b-8e31-4044-a4ea-adba1c5e52f5" [ 863.704697] env[67144]: _type = "Task" [ 863.704697] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 863.714272] env[67144]: DEBUG oslo_vmware.api [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]520cd83b-8e31-4044-a4ea-adba1c5e52f5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 863.767540] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 863.767771] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 863.767949] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Deleting the datastore file [datastore1] c2d5335a-4332-4828-855d-380cdea64a1a {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 863.768269] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-db2a70d8-c1e9-477f-aba2-840a60a85d0a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.774652] env[67144]: DEBUG oslo_vmware.api [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Waiting for the task: (returnval){ [ 863.774652] env[67144]: value = "task-2848068" [ 863.774652] env[67144]: _type = "Task" [ 863.774652] env[67144]: } to complete. 
{{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 863.782389] env[67144]: DEBUG oslo_vmware.api [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Task: {'id': task-2848068, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 864.215645] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 864.215960] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Creating directory with path [datastore1] vmware_temp/44b55686-2e1a-4515-88a5-148a2166f332/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 864.216160] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-699f2b5a-f683-4348-8586-612f9a39df2f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.227922] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Created directory with path [datastore1] vmware_temp/44b55686-2e1a-4515-88a5-148a2166f332/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 864.228117] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None 
req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Fetch image to [datastore1] vmware_temp/44b55686-2e1a-4515-88a5-148a2166f332/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 864.228354] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/44b55686-2e1a-4515-88a5-148a2166f332/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 864.229084] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81d2ae36-7c9c-46fe-a252-bc2101c25554 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.235788] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-233022d0-e6c9-48c6-a693-df8388da1110 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.244645] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05e2a54d-b73e-4d61-bf8e-c426573517b9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.274732] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b812ac2-2aa7-4e1a-976f-e8c3195d42fc {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.285873] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7cc5aaf7-2691-42f7-8321-7fc6d59075ec {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.287534] env[67144]: DEBUG oslo_vmware.api [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Task: {'id': task-2848068, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.083415} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 864.287763] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 864.287962] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 864.288186] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 864.288342] env[67144]: INFO nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 
tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Took 0.60 seconds to destroy the instance on the hypervisor. [ 864.290473] env[67144]: DEBUG nova.compute.claims [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 864.290662] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 864.290873] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 864.318414] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 864.319315] env[67144]: DEBUG nova.compute.utils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] 
[instance: c2d5335a-4332-4828-855d-380cdea64a1a] Instance c2d5335a-4332-4828-855d-380cdea64a1a could not be found. {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 864.322603] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 864.324695] env[67144]: DEBUG nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Instance disappeared during build. {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 864.324866] env[67144]: DEBUG nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 864.325044] env[67144]: DEBUG nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 864.325522] env[67144]: DEBUG nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 864.325522] env[67144]: DEBUG nova.network.neutron [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 864.362104] env[67144]: DEBUG nova.network.neutron [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 864.369926] env[67144]: DEBUG oslo_vmware.rw_handles [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/44b55686-2e1a-4515-88a5-148a2166f332/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67144) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 864.422861] env[67144]: INFO nova.compute.manager [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Took 0.10 seconds to deallocate network for instance. [ 864.427260] env[67144]: DEBUG oslo_vmware.rw_handles [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Completed reading data from the image iterator. {{(pid=67144) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 864.427435] env[67144]: DEBUG oslo_vmware.rw_handles [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/44b55686-2e1a-4515-88a5-148a2166f332/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67144) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 864.467859] env[67144]: DEBUG oslo_concurrency.lockutils [None req-3eb75b15-14b4-471d-9209-4d48ae984830 tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "c2d5335a-4332-4828-855d-380cdea64a1a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 245.180s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 864.476893] env[67144]: DEBUG nova.compute.manager [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 864.526527] env[67144]: DEBUG oslo_concurrency.lockutils [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 864.526781] env[67144]: DEBUG oslo_concurrency.lockutils [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 864.528692] env[67144]: INFO nova.compute.claims [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 
48037468-8c60-4449-8297-46eadab5246e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 864.799626] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb5ae930-5ad2-4c72-8093-adecadf387df {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.806979] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-053044ad-3e69-4f04-9eb8-7c87c0bc8326 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.835859] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81b62f65-1a75-43b3-ae00-3cf7ddc725c3 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.842523] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8c7a068-8f0d-4f60-be25-7fe7804941cd {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.856143] env[67144]: DEBUG nova.compute.provider_tree [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 864.864388] env[67144]: DEBUG nova.scheduler.client.report [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 864.877290] env[67144]: DEBUG oslo_concurrency.lockutils [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 864.877614] env[67144]: DEBUG nova.compute.manager [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 864.917890] env[67144]: DEBUG nova.compute.utils [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 864.917890] env[67144]: DEBUG nova.compute.manager [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Allocating IP information in the background. 
{{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 864.917890] env[67144]: DEBUG nova.network.neutron [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 864.933230] env[67144]: DEBUG nova.compute.manager [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 864.973282] env[67144]: DEBUG nova.policy [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac3a2f2800d434bb20d79b74334a52f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c93e989646fe46b09003c5237ab8bf5c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 864.981701] env[67144]: INFO nova.virt.block_device [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Booting with volume 1036265c-b374-4797-9e2b-6cbedfa4e29e at /dev/sda [ 865.023719] env[67144]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with 
opID=oslo.vmware-50ec69ec-4a09-4c5f-aeb6-4756c6c0763f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.033276] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdf71da9-ef48-4cf1-8c47-07ec9c2768e5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.060851] env[67144]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1a76d18e-3a8e-4e41-b93e-a63e9dec4ca8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.068301] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcab3192-c41f-48d5-9c17-701b44bd7310 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.096313] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01c340b5-89a6-4c38-b6c4-b2d9a6028a7d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.102395] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03bf317e-cf64-4213-9a07-6abbe901ed53 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.115788] env[67144]: DEBUG nova.virt.block_device [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Updating existing volume attachment record: 1fab04a3-3eb3-4d52-b70b-7aa253cedbb8 {{(pid=67144) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 865.326525] env[67144]: DEBUG nova.compute.manager [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 
tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Start spawning the instance on the hypervisor. {{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 865.327103] env[67144]: DEBUG nova.virt.hardware [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 865.327320] env[67144]: DEBUG nova.virt.hardware [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 865.327544] env[67144]: DEBUG nova.virt.hardware [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 865.327766] env[67144]: DEBUG nova.virt.hardware [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] 
Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 865.327917] env[67144]: DEBUG nova.virt.hardware [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 865.328087] env[67144]: DEBUG nova.virt.hardware [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 865.328304] env[67144]: DEBUG nova.virt.hardware [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 865.328462] env[67144]: DEBUG nova.virt.hardware [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 865.328627] env[67144]: DEBUG nova.virt.hardware [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 865.328785] env[67144]: DEBUG nova.virt.hardware [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 
tempest-ServersTestBootFromVolume-622344632-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 865.328954] env[67144]: DEBUG nova.virt.hardware [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 865.330005] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2d5d21a-9c31-4c7b-a42c-dac59fdba842 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.338817] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef8fca7b-d350-477e-84da-2cb16113f2f4 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.438362] env[67144]: DEBUG nova.network.neutron [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Successfully created port: 0210506e-de13-4fd8-8b6a-9c567ab95a62 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 866.211055] env[67144]: DEBUG nova.compute.manager [req-571ac32a-e171-4b59-b061-85cf57c9f767 req-59239f78-16a7-4982-aa8c-1cae2de01935 service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Received event network-vif-plugged-0210506e-de13-4fd8-8b6a-9c567ab95a62 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 866.211055] env[67144]: DEBUG oslo_concurrency.lockutils [req-571ac32a-e171-4b59-b061-85cf57c9f767 req-59239f78-16a7-4982-aa8c-1cae2de01935 
service nova] Acquiring lock "48037468-8c60-4449-8297-46eadab5246e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 866.211055] env[67144]: DEBUG oslo_concurrency.lockutils [req-571ac32a-e171-4b59-b061-85cf57c9f767 req-59239f78-16a7-4982-aa8c-1cae2de01935 service nova] Lock "48037468-8c60-4449-8297-46eadab5246e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 866.211055] env[67144]: DEBUG oslo_concurrency.lockutils [req-571ac32a-e171-4b59-b061-85cf57c9f767 req-59239f78-16a7-4982-aa8c-1cae2de01935 service nova] Lock "48037468-8c60-4449-8297-46eadab5246e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 866.211055] env[67144]: DEBUG nova.compute.manager [req-571ac32a-e171-4b59-b061-85cf57c9f767 req-59239f78-16a7-4982-aa8c-1cae2de01935 service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] No waiting events found dispatching network-vif-plugged-0210506e-de13-4fd8-8b6a-9c567ab95a62 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 866.211055] env[67144]: WARNING nova.compute.manager [req-571ac32a-e171-4b59-b061-85cf57c9f767 req-59239f78-16a7-4982-aa8c-1cae2de01935 service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Received unexpected event network-vif-plugged-0210506e-de13-4fd8-8b6a-9c567ab95a62 for instance with vm_state building and task_state spawning. 
[ 866.338535] env[67144]: DEBUG nova.network.neutron [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Successfully updated port: 0210506e-de13-4fd8-8b6a-9c567ab95a62 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 866.345928] env[67144]: DEBUG oslo_concurrency.lockutils [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Acquiring lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 866.346098] env[67144]: DEBUG oslo_concurrency.lockutils [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Acquired lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 866.346253] env[67144]: DEBUG nova.network.neutron [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 866.418257] env[67144]: DEBUG nova.network.neutron [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Instance cache missing network info. 
{{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 866.690784] env[67144]: DEBUG nova.network.neutron [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Updating instance_info_cache with network_info: [{"id": "0210506e-de13-4fd8-8b6a-9c567ab95a62", "address": "fa:16:3e:dc:48:57", "network": {"id": "536d2b0a-a655-4189-8b42-5de8da99eac5", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-276573328-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c93e989646fe46b09003c5237ab8bf5c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "51876cd6-d373-4edc-8595-254e5d631378", "external-id": "nsx-vlan-transportzone-916", "segmentation_id": 916, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0210506e-de", "ovs_interfaceid": "0210506e-de13-4fd8-8b6a-9c567ab95a62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 866.703183] env[67144]: DEBUG oslo_concurrency.lockutils [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Releasing lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 866.703721] env[67144]: DEBUG nova.compute.manager [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Instance network_info: |[{"id": "0210506e-de13-4fd8-8b6a-9c567ab95a62", "address": "fa:16:3e:dc:48:57", "network": {"id": "536d2b0a-a655-4189-8b42-5de8da99eac5", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-276573328-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c93e989646fe46b09003c5237ab8bf5c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "51876cd6-d373-4edc-8595-254e5d631378", "external-id": "nsx-vlan-transportzone-916", "segmentation_id": 916, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0210506e-de", "ovs_interfaceid": "0210506e-de13-4fd8-8b6a-9c567ab95a62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 866.704644] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:dc:48:57', 'network_ref': {'type': 
'OpaqueNetwork', 'network-id': '51876cd6-d373-4edc-8595-254e5d631378', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0210506e-de13-4fd8-8b6a-9c567ab95a62', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 866.712343] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Creating folder: Project (c93e989646fe46b09003c5237ab8bf5c). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 866.712951] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b89cd611-7b0b-4f98-a26d-d7ff526fc41d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.727808] env[67144]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 866.727808] env[67144]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=67144) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 866.728167] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Folder already exists: Project (c93e989646fe46b09003c5237ab8bf5c). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 866.728454] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Creating folder: Instances. Parent ref: group-v572645. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 866.728735] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b1ec0e50-a292-44bd-9af5-9374cf817dc6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.738195] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Created folder: Instances in parent group-v572645. [ 866.738470] env[67144]: DEBUG oslo.service.loopingcall [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 866.738601] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 48037468-8c60-4449-8297-46eadab5246e] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 866.738790] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ef533560-6d38-4911-9d17-84944c07abcc {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.757028] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 866.757028] env[67144]: value = "task-2848071" [ 866.757028] env[67144]: _type = "Task" [ 866.757028] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 866.764412] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848071, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 867.268181] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848071, 'name': CreateVM_Task, 'duration_secs': 0.312286} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 867.268494] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 48037468-8c60-4449-8297-46eadab5246e] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 867.269194] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'attachment_id': '1fab04a3-3eb3-4d52-b70b-7aa253cedbb8', 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-572648', 'volume_id': '1036265c-b374-4797-9e2b-6cbedfa4e29e', 'name': 'volume-1036265c-b374-4797-9e2b-6cbedfa4e29e', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '48037468-8c60-4449-8297-46eadab5246e', 'attached_at': '', 'detached_at': '', 'volume_id': '1036265c-b374-4797-9e2b-6cbedfa4e29e', 'serial': '1036265c-b374-4797-9e2b-6cbedfa4e29e'}, 'device_type': None, 'mount_device': '/dev/sda', 'delete_on_termination': True, 'boot_index': 0, 'disk_bus': None, 'guest_format': None, 'volume_type': None}], 'swap': None} {{(pid=67144) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 867.269429] env[67144]: DEBUG nova.virt.vmwareapi.volumeops [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 
48037468-8c60-4449-8297-46eadab5246e] Root volume attach. Driver type: vmdk {{(pid=67144) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 867.270210] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dd1642a-c178-45cf-80f7-82396deafc54 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 867.278451] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b65523f-e25b-4acb-8291-ab8727b2efcd {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 867.284666] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ef18053-cfb5-4750-ae68-04d5fb32c860 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 867.290589] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-d6311b8b-7fea-4d48-9c43-b4e88d2c3c98 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 867.297041] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Waiting for the task: (returnval){ [ 867.297041] env[67144]: value = "task-2848072" [ 867.297041] env[67144]: _type = "Task" [ 867.297041] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 867.304264] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Task: {'id': task-2848072, 'name': RelocateVM_Task} progress is 5%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 867.806282] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Task: {'id': task-2848072, 'name': RelocateVM_Task, 'duration_secs': 0.3704} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 867.806559] env[67144]: DEBUG nova.virt.vmwareapi.volumeops [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Volume attach. Driver type: vmdk {{(pid=67144) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 867.806733] env[67144]: DEBUG nova.virt.vmwareapi.volumeops [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-572648', 'volume_id': '1036265c-b374-4797-9e2b-6cbedfa4e29e', 'name': 'volume-1036265c-b374-4797-9e2b-6cbedfa4e29e', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '48037468-8c60-4449-8297-46eadab5246e', 'attached_at': '', 'detached_at': '', 'volume_id': '1036265c-b374-4797-9e2b-6cbedfa4e29e', 'serial': '1036265c-b374-4797-9e2b-6cbedfa4e29e'} {{(pid=67144) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 867.807559] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-288f2f0e-38e3-493c-8fbd-9a182da6a156 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 867.824377] 
env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96f53e5b-6faa-47be-9b93-f2159a4c776a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 867.845647] env[67144]: DEBUG nova.virt.vmwareapi.volumeops [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Reconfiguring VM instance instance-0000000f to attach disk [datastore1] volume-1036265c-b374-4797-9e2b-6cbedfa4e29e/volume-1036265c-b374-4797-9e2b-6cbedfa4e29e.vmdk or device None with type thin {{(pid=67144) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 867.845892] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-c4ce4f62-c1fa-43ff-a3d1-237984d6c4d0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 867.865421] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Waiting for the task: (returnval){ [ 867.865421] env[67144]: value = "task-2848073" [ 867.865421] env[67144]: _type = "Task" [ 867.865421] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 867.872992] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Task: {'id': task-2848073, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 868.266801] env[67144]: DEBUG nova.compute.manager [req-d218e392-076b-4e77-8a20-c869a15375f4 req-467073e0-a1d4-4029-b3f7-948ba137b5c7 service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Received event network-changed-0210506e-de13-4fd8-8b6a-9c567ab95a62 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 868.267014] env[67144]: DEBUG nova.compute.manager [req-d218e392-076b-4e77-8a20-c869a15375f4 req-467073e0-a1d4-4029-b3f7-948ba137b5c7 service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Refreshing instance network info cache due to event network-changed-0210506e-de13-4fd8-8b6a-9c567ab95a62. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 868.267233] env[67144]: DEBUG oslo_concurrency.lockutils [req-d218e392-076b-4e77-8a20-c869a15375f4 req-467073e0-a1d4-4029-b3f7-948ba137b5c7 service nova] Acquiring lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 868.267418] env[67144]: DEBUG oslo_concurrency.lockutils [req-d218e392-076b-4e77-8a20-c869a15375f4 req-467073e0-a1d4-4029-b3f7-948ba137b5c7 service nova] Acquired lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 868.267526] env[67144]: DEBUG nova.network.neutron [req-d218e392-076b-4e77-8a20-c869a15375f4 req-467073e0-a1d4-4029-b3f7-948ba137b5c7 service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Refreshing network info cache for port 0210506e-de13-4fd8-8b6a-9c567ab95a62 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 868.374933] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 
tempest-ServersTestBootFromVolume-622344632-project-member] Task: {'id': task-2848073, 'name': ReconfigVM_Task, 'duration_secs': 0.250498} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 868.375218] env[67144]: DEBUG nova.virt.vmwareapi.volumeops [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Reconfigured VM instance instance-0000000f to attach disk [datastore1] volume-1036265c-b374-4797-9e2b-6cbedfa4e29e/volume-1036265c-b374-4797-9e2b-6cbedfa4e29e.vmdk or device None with type thin {{(pid=67144) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 868.380230] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-e47f2f1e-64a9-47ac-9897-3f65709fe58d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.396452] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Waiting for the task: (returnval){ [ 868.396452] env[67144]: value = "task-2848074" [ 868.396452] env[67144]: _type = "Task" [ 868.396452] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 868.404139] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Task: {'id': task-2848074, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 868.676656] env[67144]: DEBUG nova.network.neutron [req-d218e392-076b-4e77-8a20-c869a15375f4 req-467073e0-a1d4-4029-b3f7-948ba137b5c7 service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Updated VIF entry in instance network info cache for port 0210506e-de13-4fd8-8b6a-9c567ab95a62. {{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 868.677270] env[67144]: DEBUG nova.network.neutron [req-d218e392-076b-4e77-8a20-c869a15375f4 req-467073e0-a1d4-4029-b3f7-948ba137b5c7 service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Updating instance_info_cache with network_info: [{"id": "0210506e-de13-4fd8-8b6a-9c567ab95a62", "address": "fa:16:3e:dc:48:57", "network": {"id": "536d2b0a-a655-4189-8b42-5de8da99eac5", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-276573328-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c93e989646fe46b09003c5237ab8bf5c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "51876cd6-d373-4edc-8595-254e5d631378", "external-id": "nsx-vlan-transportzone-916", "segmentation_id": 916, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0210506e-de", "ovs_interfaceid": "0210506e-de13-4fd8-8b6a-9c567ab95a62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 
[ 868.687144] env[67144]: DEBUG oslo_concurrency.lockutils [req-d218e392-076b-4e77-8a20-c869a15375f4 req-467073e0-a1d4-4029-b3f7-948ba137b5c7 service nova] Releasing lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 868.908037] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Task: {'id': task-2848074, 'name': ReconfigVM_Task, 'duration_secs': 0.113746} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 868.908037] env[67144]: DEBUG nova.virt.vmwareapi.volumeops [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-572648', 'volume_id': '1036265c-b374-4797-9e2b-6cbedfa4e29e', 'name': 'volume-1036265c-b374-4797-9e2b-6cbedfa4e29e', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '48037468-8c60-4449-8297-46eadab5246e', 'attached_at': '', 'detached_at': '', 'volume_id': '1036265c-b374-4797-9e2b-6cbedfa4e29e', 'serial': '1036265c-b374-4797-9e2b-6cbedfa4e29e'} {{(pid=67144) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}}
[ 868.908576] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-ff872240-17a7-4f66-b56d-7ef51b425590 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 868.915128] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Waiting for the task: (returnval){
[ 868.915128] env[67144]: value = "task-2848075"
[ 868.915128] env[67144]: _type = "Task"
[ 868.915128] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 868.923058] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Task: {'id': task-2848075, 'name': Rename_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 869.425342] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Task: {'id': task-2848075, 'name': Rename_Task, 'duration_secs': 0.122854} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 869.425620] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Powering on the VM {{(pid=67144) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}}
[ 869.425859] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-97f3572a-dc9a-4e0c-833c-a37db787fdcd {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 869.432923] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Waiting for the task: (returnval){
[ 869.432923] env[67144]: value = "task-2848076"
[ 869.432923] env[67144]: _type = "Task"
[ 869.432923] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 869.440075] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Task: {'id': task-2848076, 'name': PowerOnVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 869.942905] env[67144]: DEBUG oslo_vmware.api [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Task: {'id': task-2848076, 'name': PowerOnVM_Task, 'duration_secs': 0.442068} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 869.943295] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Powered on the VM {{(pid=67144) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}}
[ 869.943382] env[67144]: INFO nova.compute.manager [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Took 4.62 seconds to spawn the instance on the hypervisor.
[ 869.943622] env[67144]: DEBUG nova.compute.manager [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Checking state {{(pid=67144) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}}
[ 869.944406] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab96d85d-391e-431d-b9eb-8c194b24cda9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 869.990570] env[67144]: INFO nova.compute.manager [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] [instance: 48037468-8c60-4449-8297-46eadab5246e] Took 5.48 seconds to build instance.
[ 870.001164] env[67144]: DEBUG oslo_concurrency.lockutils [None req-0f0a69ea-6f17-42c4-bbd4-fb61ea3a0a69 tempest-ServersTestBootFromVolume-622344632 tempest-ServersTestBootFromVolume-622344632-project-member] Lock "48037468-8c60-4449-8297-46eadab5246e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 158.798s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 870.012718] env[67144]: DEBUG nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 870.056672] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 870.056933] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 870.058952] env[67144]: INFO nova.compute.claims [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 870.336128] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f447760e-28ad-4107-b153-1e28d83b1898 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 870.344262] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d968ce0-a24a-4276-a281-2e516b6c3cd9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 870.375776] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44bdc4da-244b-4f5b-bad4-e7ce9d6aa3ee {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 870.383184] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a96dd15-5cbd-4c03-9a61-89239fd48200 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 870.396567] env[67144]: DEBUG nova.compute.provider_tree [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 870.407014] env[67144]: DEBUG nova.scheduler.client.report [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 870.416745] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 870.416908] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Cleaning up deleted instances {{(pid=67144) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}}
[ 870.427158] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.369s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 870.427158] env[67144]: DEBUG nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 870.435459] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] There are 4 instances to clean {{(pid=67144) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}}
[ 870.435723] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Instance has had 0 of 5 cleanup attempts {{(pid=67144) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 870.460013] env[67144]: DEBUG nova.compute.utils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 870.461328] env[67144]: DEBUG nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 870.461488] env[67144]: DEBUG nova.network.neutron [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 870.468818] env[67144]: DEBUG nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 870.472213] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: c2d5335a-4332-4828-855d-380cdea64a1a] Instance has had 0 of 5 cleanup attempts {{(pid=67144) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 870.515746] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Instance has had 0 of 5 cleanup attempts {{(pid=67144) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 870.533445] env[67144]: DEBUG nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Start spawning the instance on the hypervisor. {{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 870.537251] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b04052f8-b29f-4b32-b249-02b83d3d77f9] Instance has had 0 of 5 cleanup attempts {{(pid=67144) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 870.539562] env[67144]: DEBUG nova.policy [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2035947e7b424e03b500caa235e9bd86', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '21b508a37e0a44f4890850c34340b8db', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}}
[ 870.564382] env[67144]: DEBUG nova.virt.hardware [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 870.564382] env[67144]: DEBUG nova.virt.hardware [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 870.564382] env[67144]: DEBUG nova.virt.hardware [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 870.564382] env[67144]: DEBUG nova.virt.hardware [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 870.564382] env[67144]: DEBUG nova.virt.hardware [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 870.564382] env[67144]: DEBUG nova.virt.hardware [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 870.564382] env[67144]: DEBUG nova.virt.hardware [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 870.564382] env[67144]: DEBUG nova.virt.hardware [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 870.564382] env[67144]: DEBUG nova.virt.hardware [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 870.564382] env[67144]: DEBUG nova.virt.hardware [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 870.565112] env[67144]: DEBUG nova.virt.hardware [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 870.566123] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5723733e-4d72-4412-966a-d023cd666606 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 870.569842] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 870.571332] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Cleaning up deleted instances with incomplete migration {{(pid=67144) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}}
[ 870.576767] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-907494eb-0eac-455e-8859-1de0405c05dd {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 870.581840] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 871.210632] env[67144]: DEBUG nova.network.neutron [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Successfully created port: 3428e31d-47aa-4bc6-b54d-4580aafba111 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 871.396222] env[67144]: DEBUG nova.compute.manager [req-3b5e9681-38a6-4e1c-9012-24b4f7f5e39e req-f5271341-99dc-4455-85ee-83ec5b1ef36c service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Received event network-changed-0210506e-de13-4fd8-8b6a-9c567ab95a62 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 871.396421] env[67144]: DEBUG nova.compute.manager [req-3b5e9681-38a6-4e1c-9012-24b4f7f5e39e req-f5271341-99dc-4455-85ee-83ec5b1ef36c service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Refreshing instance network info cache due to event network-changed-0210506e-de13-4fd8-8b6a-9c567ab95a62. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 871.396634] env[67144]: DEBUG oslo_concurrency.lockutils [req-3b5e9681-38a6-4e1c-9012-24b4f7f5e39e req-f5271341-99dc-4455-85ee-83ec5b1ef36c service nova] Acquiring lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 871.396778] env[67144]: DEBUG oslo_concurrency.lockutils [req-3b5e9681-38a6-4e1c-9012-24b4f7f5e39e req-f5271341-99dc-4455-85ee-83ec5b1ef36c service nova] Acquired lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 871.396936] env[67144]: DEBUG nova.network.neutron [req-3b5e9681-38a6-4e1c-9012-24b4f7f5e39e req-f5271341-99dc-4455-85ee-83ec5b1ef36c service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Refreshing network info cache for port 0210506e-de13-4fd8-8b6a-9c567ab95a62 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 871.827249] env[67144]: DEBUG nova.network.neutron [req-3b5e9681-38a6-4e1c-9012-24b4f7f5e39e req-f5271341-99dc-4455-85ee-83ec5b1ef36c service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Updated VIF entry in instance network info cache for port 0210506e-de13-4fd8-8b6a-9c567ab95a62. {{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 871.827620] env[67144]: DEBUG nova.network.neutron [req-3b5e9681-38a6-4e1c-9012-24b4f7f5e39e req-f5271341-99dc-4455-85ee-83ec5b1ef36c service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Updating instance_info_cache with network_info: [{"id": "0210506e-de13-4fd8-8b6a-9c567ab95a62", "address": "fa:16:3e:dc:48:57", "network": {"id": "536d2b0a-a655-4189-8b42-5de8da99eac5", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-276573328-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.130", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c93e989646fe46b09003c5237ab8bf5c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "51876cd6-d373-4edc-8595-254e5d631378", "external-id": "nsx-vlan-transportzone-916", "segmentation_id": 916, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0210506e-de", "ovs_interfaceid": "0210506e-de13-4fd8-8b6a-9c567ab95a62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 871.840591] env[67144]: DEBUG oslo_concurrency.lockutils [req-3b5e9681-38a6-4e1c-9012-24b4f7f5e39e req-f5271341-99dc-4455-85ee-83ec5b1ef36c service nova] Releasing lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 872.038879] env[67144]: DEBUG nova.compute.manager [req-bd9866b6-cb5a-4d3f-8d53-5554edd789eb req-5df51301-4311-429f-b3d9-637fa5d28805 service nova] [instance: c3621484-8333-4375-9700-62b08d90887f] Received event network-vif-plugged-3428e31d-47aa-4bc6-b54d-4580aafba111 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 872.039116] env[67144]: DEBUG oslo_concurrency.lockutils [req-bd9866b6-cb5a-4d3f-8d53-5554edd789eb req-5df51301-4311-429f-b3d9-637fa5d28805 service nova] Acquiring lock "c3621484-8333-4375-9700-62b08d90887f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 872.039688] env[67144]: DEBUG oslo_concurrency.lockutils [req-bd9866b6-cb5a-4d3f-8d53-5554edd789eb req-5df51301-4311-429f-b3d9-637fa5d28805 service nova] Lock "c3621484-8333-4375-9700-62b08d90887f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 872.039875] env[67144]: DEBUG oslo_concurrency.lockutils [req-bd9866b6-cb5a-4d3f-8d53-5554edd789eb req-5df51301-4311-429f-b3d9-637fa5d28805 service nova] Lock "c3621484-8333-4375-9700-62b08d90887f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 872.040664] env[67144]: DEBUG nova.compute.manager [req-bd9866b6-cb5a-4d3f-8d53-5554edd789eb req-5df51301-4311-429f-b3d9-637fa5d28805 service nova] [instance: c3621484-8333-4375-9700-62b08d90887f] No waiting events found dispatching network-vif-plugged-3428e31d-47aa-4bc6-b54d-4580aafba111 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 872.040664] env[67144]: WARNING nova.compute.manager [req-bd9866b6-cb5a-4d3f-8d53-5554edd789eb req-5df51301-4311-429f-b3d9-637fa5d28805 service nova] [instance: c3621484-8333-4375-9700-62b08d90887f] Received unexpected event network-vif-plugged-3428e31d-47aa-4bc6-b54d-4580aafba111 for instance with vm_state building and task_state spawning.
[ 872.223069] env[67144]: DEBUG nova.network.neutron [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Successfully updated port: 3428e31d-47aa-4bc6-b54d-4580aafba111 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 872.231524] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Acquiring lock "refresh_cache-c3621484-8333-4375-9700-62b08d90887f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 872.231833] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Acquired lock "refresh_cache-c3621484-8333-4375-9700-62b08d90887f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 872.232025] env[67144]: DEBUG nova.network.neutron [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 872.301707] env[67144]: DEBUG nova.network.neutron [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Instance cache missing network info. {{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 872.586477] env[67144]: DEBUG nova.network.neutron [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Updating instance_info_cache with network_info: [{"id": "3428e31d-47aa-4bc6-b54d-4580aafba111", "address": "fa:16:3e:b0:81:6f", "network": {"id": "e998e2a4-2650-4aa1-9060-556aa5629dd1", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1454877789-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "21b508a37e0a44f4890850c34340b8db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c883fb98-d172-4510-8cf4-07aafdf771af", "external-id": "nsx-vlan-transportzone-570", "segmentation_id": 570, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3428e31d-47", "ovs_interfaceid": "3428e31d-47aa-4bc6-b54d-4580aafba111", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 872.591768] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 872.592031] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 872.592225] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 872.602285] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Releasing lock "refresh_cache-c3621484-8333-4375-9700-62b08d90887f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 872.602641] env[67144]: DEBUG nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Instance network_info: |[{"id": "3428e31d-47aa-4bc6-b54d-4580aafba111", "address": "fa:16:3e:b0:81:6f", "network": {"id": "e998e2a4-2650-4aa1-9060-556aa5629dd1", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1454877789-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "21b508a37e0a44f4890850c34340b8db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c883fb98-d172-4510-8cf4-07aafdf771af", "external-id": "nsx-vlan-transportzone-570", "segmentation_id": 570, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3428e31d-47", "ovs_interfaceid": "3428e31d-47aa-4bc6-b54d-4580aafba111", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 872.603349] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b0:81:6f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c883fb98-d172-4510-8cf4-07aafdf771af', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3428e31d-47aa-4bc6-b54d-4580aafba111', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 872.612070] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Creating folder: Project (21b508a37e0a44f4890850c34340b8db). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 872.613076] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d43c5404-0265-4241-b61b-fc91c3eddfa4 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 872.615868] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 872.616123] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 872.616335] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 872.616537] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67144) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 872.617706] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9666e56a-59c7-4855-a86b-18f0e2167fab {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 872.627381] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9af071d3-a782-46d0-a1e7-4e7a6709968c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 872.632887] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Created folder: Project (21b508a37e0a44f4890850c34340b8db) in parent group-v572613.
[ 872.633120] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Creating folder: Instances. Parent ref: group-v572666. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 872.633790] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-35d33084-47ea-4a9a-816f-76f7ce6f1923 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 872.644736] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c30c457-762f-4679-afc6-9ca536023c82 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 872.652033] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dbb56bb-ca16-47f1-9b37-303e980387ce {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 872.656764] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Created folder: Instances in parent group-v572666.
[ 872.657031] env[67144]: DEBUG oslo.service.loopingcall [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 872.657681] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c3621484-8333-4375-9700-62b08d90887f] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 872.658011] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-10656919-4886-4d77-bb00-caeeb953c5e8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.702870] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181063MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=67144) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 872.703045] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 872.703256] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 872.709587] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 872.709587] env[67144]: value = 
"task-2848079" [ 872.709587] env[67144]: _type = "Task" [ 872.709587] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 872.717789] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848079, 'name': CreateVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 872.786072] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.786072] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance ca7b7941-c016-4968-9beb-f8c094ca16cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.786072] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance f61f525f-70a5-402f-bf52-0bd4041b907f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.786072] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance d4eaa8fd-84b5-47a2-832a-9106187bc531 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.786072] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b1bba9da-84f7-4d67-8ad6-af7cb429dd9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.786437] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 0811722e-2ae9-4018-a85d-ab4fe5f46370 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.786437] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 42ce3afe-e725-4688-b048-bd6721c22c35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.786586] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 48037468-8c60-4449-8297-46eadab5246e actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.786744] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance c3621484-8333-4375-9700-62b08d90887f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.800298] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance eebe36ea-6a07-4806-bade-4222dcf24247 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.817091] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 3a37ecb3-0196-4230-adea-ed14355ece08 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.828737] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance abda0de6-f344-4dd1-b439-42826b59de5a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.844234] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b932a680-76a5-4f08-ac38-2fc1578b4a86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.856746] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 56ba6c8d-1717-4d07-b547-7872f985b0f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.869312] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 41193ca9-3f5f-43a2-9335-1010b1f752a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.881216] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance e64bc93e-f99f-4f9e-a41e-283d405b1b92 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.896428] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 07259d91-ca24-4e5e-8340-d72f3b8e2776 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.908297] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 670f3974-b332-48c2-9aab-6a9ed01731b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.923237] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.937444] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance e32c24e1-485d-48b9-827b-fceb6828510c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.937714] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 872.937827] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 873.220722] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848079, 'name': CreateVM_Task, 'duration_secs': 0.320714} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 873.221913] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c3621484-8333-4375-9700-62b08d90887f] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 873.222686] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a378941a-f1b7-4dd8-a510-3820d73565c8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.226068] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 873.226372] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 873.226557] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 873.226803] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d3086cad-3f57-43ff-9659-c9a02afdbb90 {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.234605] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0d9bd00-94fe-4857-933c-2025bfffe1f5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.238018] env[67144]: DEBUG oslo_vmware.api [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Waiting for the task: (returnval){ [ 873.238018] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5295d078-5401-1f60-d9d7-4cbe63c00888" [ 873.238018] env[67144]: _type = "Task" [ 873.238018] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 873.267148] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-248849d4-b33f-45d4-9ab9-caae273d0d05 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.273354] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 873.273599] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 873.273807] env[67144]: DEBUG oslo_concurrency.lockutils [None 
req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 873.277506] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5e5ad31-3f36-477c-bc9f-e5e330f6761f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.290903] env[67144]: DEBUG nova.compute.provider_tree [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 873.300541] env[67144]: DEBUG nova.scheduler.client.report [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 873.317606] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67144) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 873.317811] env[67144]: DEBUG oslo_concurrency.lockutils [None 
req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 874.073833] env[67144]: DEBUG nova.compute.manager [req-8253b03f-1f4c-4ecb-908f-19cb3e0ffcb1 req-fe89fbcd-3fae-4640-84b0-1073d87a7568 service nova] [instance: c3621484-8333-4375-9700-62b08d90887f] Received event network-changed-3428e31d-47aa-4bc6-b54d-4580aafba111 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 874.073976] env[67144]: DEBUG nova.compute.manager [req-8253b03f-1f4c-4ecb-908f-19cb3e0ffcb1 req-fe89fbcd-3fae-4640-84b0-1073d87a7568 service nova] [instance: c3621484-8333-4375-9700-62b08d90887f] Refreshing instance network info cache due to event network-changed-3428e31d-47aa-4bc6-b54d-4580aafba111. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 874.074200] env[67144]: DEBUG oslo_concurrency.lockutils [req-8253b03f-1f4c-4ecb-908f-19cb3e0ffcb1 req-fe89fbcd-3fae-4640-84b0-1073d87a7568 service nova] Acquiring lock "refresh_cache-c3621484-8333-4375-9700-62b08d90887f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 874.074352] env[67144]: DEBUG oslo_concurrency.lockutils [req-8253b03f-1f4c-4ecb-908f-19cb3e0ffcb1 req-fe89fbcd-3fae-4640-84b0-1073d87a7568 service nova] Acquired lock "refresh_cache-c3621484-8333-4375-9700-62b08d90887f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 874.074641] env[67144]: DEBUG nova.network.neutron [req-8253b03f-1f4c-4ecb-908f-19cb3e0ffcb1 req-fe89fbcd-3fae-4640-84b0-1073d87a7568 service nova] [instance: c3621484-8333-4375-9700-62b08d90887f] Refreshing network info cache for port 3428e31d-47aa-4bc6-b54d-4580aafba111 {{(pid=67144) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2007}} [ 874.141900] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 874.142104] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Starting heal instance info cache {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 874.142204] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Rebuilding the list of instances to heal {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 874.162743] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 874.162903] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 874.163047] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 874.163185] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Skipping network cache update for instance because it is Building. 
{{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 874.163309] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 874.163431] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 874.163553] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 874.163679] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: c3621484-8333-4375-9700-62b08d90887f] Skipping network cache update for instance because it is Building. 
{{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 874.206767] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 874.206923] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquired lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 874.207093] env[67144]: DEBUG nova.network.neutron [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 48037468-8c60-4449-8297-46eadab5246e] Forcefully refreshing network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2004}} [ 874.207278] env[67144]: DEBUG nova.objects.instance [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lazy-loading 'info_cache' on Instance uuid 48037468-8c60-4449-8297-46eadab5246e {{(pid=67144) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 874.349466] env[67144]: DEBUG nova.network.neutron [req-8253b03f-1f4c-4ecb-908f-19cb3e0ffcb1 req-fe89fbcd-3fae-4640-84b0-1073d87a7568 service nova] [instance: c3621484-8333-4375-9700-62b08d90887f] Updated VIF entry in instance network info cache for port 3428e31d-47aa-4bc6-b54d-4580aafba111. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 874.349821] env[67144]: DEBUG nova.network.neutron [req-8253b03f-1f4c-4ecb-908f-19cb3e0ffcb1 req-fe89fbcd-3fae-4640-84b0-1073d87a7568 service nova] [instance: c3621484-8333-4375-9700-62b08d90887f] Updating instance_info_cache with network_info: [{"id": "3428e31d-47aa-4bc6-b54d-4580aafba111", "address": "fa:16:3e:b0:81:6f", "network": {"id": "e998e2a4-2650-4aa1-9060-556aa5629dd1", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1454877789-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "21b508a37e0a44f4890850c34340b8db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c883fb98-d172-4510-8cf4-07aafdf771af", "external-id": "nsx-vlan-transportzone-570", "segmentation_id": 570, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3428e31d-47", "ovs_interfaceid": "3428e31d-47aa-4bc6-b54d-4580aafba111", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 874.359494] env[67144]: DEBUG oslo_concurrency.lockutils [req-8253b03f-1f4c-4ecb-908f-19cb3e0ffcb1 req-fe89fbcd-3fae-4640-84b0-1073d87a7568 service nova] Releasing lock "refresh_cache-c3621484-8333-4375-9700-62b08d90887f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 874.477396] env[67144]: DEBUG nova.network.neutron [None 
req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 48037468-8c60-4449-8297-46eadab5246e] Updating instance_info_cache with network_info: [{"id": "0210506e-de13-4fd8-8b6a-9c567ab95a62", "address": "fa:16:3e:dc:48:57", "network": {"id": "536d2b0a-a655-4189-8b42-5de8da99eac5", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-276573328-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.130", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c93e989646fe46b09003c5237ab8bf5c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "51876cd6-d373-4edc-8595-254e5d631378", "external-id": "nsx-vlan-transportzone-916", "segmentation_id": 916, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0210506e-de", "ovs_interfaceid": "0210506e-de13-4fd8-8b6a-9c567ab95a62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 874.486593] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Releasing lock "refresh_cache-48037468-8c60-4449-8297-46eadab5246e" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 874.486788] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 48037468-8c60-4449-8297-46eadab5246e] Updated the network info_cache for instance 
{{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9885}} [ 874.486995] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 874.487171] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 874.487304] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67144) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 874.756651] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 875.415880] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 876.416081] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 890.188970] env[67144]: DEBUG nova.compute.manager [req-5e80105b-67ca-4b19-97a7-097db7a012db req-15dd8cc7-2f15-4218-b5e5-d89dd3bdec4a service nova] [instance: 48037468-8c60-4449-8297-46eadab5246e] Received event 
network-vif-deleted-0210506e-de13-4fd8-8b6a-9c567ab95a62 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 900.969144] env[67144]: DEBUG nova.compute.manager [req-de152ec7-26e9-424c-aaf0-8b291ef4bfe3 req-8e5a339d-9706-4e43-bf2d-c0447dea0dbb service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Received event network-vif-deleted-44714ba6-ad01-48a3-bfe7-d65dc34dd361 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 900.969434] env[67144]: INFO nova.compute.manager [req-de152ec7-26e9-424c-aaf0-8b291ef4bfe3 req-8e5a339d-9706-4e43-bf2d-c0447dea0dbb service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Neutron deleted interface 44714ba6-ad01-48a3-bfe7-d65dc34dd361; detaching it from the instance and deleting it from the info cache [ 900.969619] env[67144]: DEBUG nova.network.neutron [req-de152ec7-26e9-424c-aaf0-8b291ef4bfe3 req-8e5a339d-9706-4e43-bf2d-c0447dea0dbb service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Updating instance_info_cache with network_info: [{"id": "8e006206-1f62-42fa-b1da-025935a88d27", "address": "fa:16:3e:f5:0c:2e", "network": {"id": "27a55d29-4601-4852-b0d8-40b8547f86ef", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1311409486", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.192", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9d75445bcda7473ba3ae33ebf292a0c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39a4aca0-934b-4a91-8779-6a4360c3f967", "external-id": "nsx-vlan-transportzone-454", "segmentation_id": 454, "bound_drivers": {"0": "nsxv3"}}, 
"devname": "tap8e006206-1f", "ovs_interfaceid": "8e006206-1f62-42fa-b1da-025935a88d27", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 900.983766] env[67144]: DEBUG oslo_concurrency.lockutils [req-de152ec7-26e9-424c-aaf0-8b291ef4bfe3 req-8e5a339d-9706-4e43-bf2d-c0447dea0dbb service nova] Acquiring lock "f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 903.184162] env[67144]: DEBUG nova.compute.manager [req-97c186b8-9501-4e05-915c-2eefd9d2a2eb req-fe456044-4d27-4e49-8e60-0a0cb04ce9c1 service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Received event network-vif-deleted-8e006206-1f62-42fa-b1da-025935a88d27 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 903.184162] env[67144]: DEBUG nova.compute.manager [req-97c186b8-9501-4e05-915c-2eefd9d2a2eb req-fe456044-4d27-4e49-8e60-0a0cb04ce9c1 service nova] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Received event network-vif-deleted-66332029-9ce1-424d-9899-20f64e4d004b {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 903.184162] env[67144]: DEBUG nova.compute.manager [req-97c186b8-9501-4e05-915c-2eefd9d2a2eb req-fe456044-4d27-4e49-8e60-0a0cb04ce9c1 service nova] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Received event network-vif-deleted-5c68f2d8-0093-49d8-9fa5-2933ec72f8c0 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 906.333564] env[67144]: DEBUG nova.compute.manager [req-ed301dfe-5fdd-4798-b3ed-8103e49d5e7a req-c68b48b8-2dd1-473f-afa4-5279bb0e8e27 service nova] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Received event network-vif-deleted-7b56e275-2ddc-4bd6-b538-86c7cc0571c7 
{{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 909.545920] env[67144]: DEBUG nova.compute.manager [req-0962bad1-7e96-40e2-96d1-e796b507b4ee req-41613d32-0520-4c60-916b-54c1a02bf3be service nova] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Received event network-vif-deleted-a2e066a2-ec0b-4533-92d3-97cddba99b24 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 912.244236] env[67144]: DEBUG nova.compute.manager [req-78fc2398-c979-40ff-97a4-c9da802ff8d6 req-de0eb1bd-ee08-46f2-bb45-f6e6301b1048 service nova] [instance: c3621484-8333-4375-9700-62b08d90887f] Received event network-vif-deleted-3428e31d-47aa-4bc6-b54d-4580aafba111 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 913.394225] env[67144]: WARNING oslo_vmware.rw_handles [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 913.394225] env[67144]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 913.394225] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 913.394225] env[67144]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 913.394225] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 913.394225] env[67144]: ERROR oslo_vmware.rw_handles response.begin() [ 913.394225] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 913.394225] env[67144]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 913.394225] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status 
[ 913.394225] env[67144]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 913.394225] env[67144]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 913.394225] env[67144]: ERROR oslo_vmware.rw_handles [ 913.394954] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Downloaded image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to vmware_temp/44b55686-2e1a-4515-88a5-148a2166f332/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 913.396410] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Caching image {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 913.396657] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Copying Virtual Disk [datastore1] vmware_temp/44b55686-2e1a-4515-88a5-148a2166f332/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk to [datastore1] vmware_temp/44b55686-2e1a-4515-88a5-148a2166f332/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk {{(pid=67144) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 913.396930] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d2bef3b5-e121-41ae-bb7b-f37dbab4f96e {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.404917] env[67144]: DEBUG oslo_vmware.api [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Waiting for the task: (returnval){ [ 913.404917] env[67144]: value = "task-2848081" [ 913.404917] env[67144]: _type = "Task" [ 913.404917] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 913.413414] env[67144]: DEBUG oslo_vmware.api [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Task: {'id': task-2848081, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 913.919454] env[67144]: DEBUG oslo_vmware.exceptions [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Fault InvalidArgument not matched. 
{{(pid=67144) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 913.919742] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 913.920338] env[67144]: ERROR nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 913.920338] env[67144]: Faults: ['InvalidArgument'] [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Traceback (most recent call last): [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] yield resources [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] self.driver.spawn(context, instance, image_meta, [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 
6cbf4358-dcfa-471b-ae1a-e6a512c47d26] self._vmops.spawn(context, instance, image_meta, injected_files, [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] self._fetch_image_if_missing(context, vi) [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] image_cache(vi, tmp_image_ds_loc) [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] vm_util.copy_virtual_disk( [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] session._wait_for_task(vmdk_copy_task) [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] return self.wait_for_task(task_ref) [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 
6cbf4358-dcfa-471b-ae1a-e6a512c47d26] return evt.wait() [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] result = hub.switch() [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] return self.greenlet.switch() [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] self.f(*self.args, **self.kw) [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] raise exceptions.translate_fault(task_info.error) [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Faults: ['InvalidArgument'] [ 913.920338] env[67144]: ERROR nova.compute.manager [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] [ 913.921359] env[67144]: INFO nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 
tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Terminating instance [ 913.927619] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 913.927619] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 913.927619] env[67144]: DEBUG nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Start destroying the instance on the hypervisor. 
{{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 913.927619] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 913.927619] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fea83598-38ca-4d69-879d-084a9543c9b7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.929472] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6619c944-41d3-4fcc-a74b-90fc197d0095 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.936317] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 913.938279] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5c6799f5-06a5-48ce-99e0-062835de9a24 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.939043] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 913.939221] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None 
req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 913.941601] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-02658910-a2de-497f-8179-8874f160332c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.946794] env[67144]: DEBUG oslo_vmware.api [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Waiting for the task: (returnval){ [ 913.946794] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52995f4d-8bf0-1008-bc5f-36d691891da1" [ 913.946794] env[67144]: _type = "Task" [ 913.946794] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 913.959722] env[67144]: DEBUG oslo_vmware.api [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52995f4d-8bf0-1008-bc5f-36d691891da1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 914.015259] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 914.015469] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 914.015640] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Deleting the datastore file [datastore1] 6cbf4358-dcfa-471b-ae1a-e6a512c47d26 {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 914.015895] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-92071fbb-2aa0-4d0f-868c-914367d55adb {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 914.022131] env[67144]: DEBUG oslo_vmware.api [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Waiting for the task: (returnval){ [ 914.022131] env[67144]: value = "task-2848083" [ 914.022131] env[67144]: _type = "Task" [ 914.022131] env[67144]: } to complete. 
{{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 914.029579] env[67144]: DEBUG oslo_vmware.api [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Task: {'id': task-2848083, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 914.465916] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 914.466427] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Creating directory with path [datastore1] vmware_temp/44bbe015-49fa-4f62-95b7-10f37767519e/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 914.466427] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5ca54da5-01cc-4509-baf1-c2e6c10e4758 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 914.478837] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Created directory with path [datastore1] vmware_temp/44bbe015-49fa-4f62-95b7-10f37767519e/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 914.479062] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None 
req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Fetch image to [datastore1] vmware_temp/44bbe015-49fa-4f62-95b7-10f37767519e/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 914.479239] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/44bbe015-49fa-4f62-95b7-10f37767519e/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 914.480030] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f765968b-89f1-465d-a39a-40d034702c80 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 914.498621] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22037ddc-7832-41cb-8c5f-6d62cc23c7a6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 914.511119] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaa0d971-daa1-45bf-8faf-a7c4731d6fd2 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 914.550373] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83afc2c5-f28e-4f1a-95ef-b66b7b1354de {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 914.558678] env[67144]: DEBUG oslo_vmware.api [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Task: {'id': task-2848083, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064204} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 914.560492] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 914.561220] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 914.561220] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 914.561480] env[67144]: INFO nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 914.563620] env[67144]: DEBUG nova.compute.claims [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 914.563793] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 914.564299] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 914.566990] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fcb33643-e4da-48ba-80e0-89c3ab1f8ebc {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 914.588575] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 914.605019] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 
tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.041s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 914.605812] env[67144]: DEBUG nova.compute.utils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Instance 6cbf4358-dcfa-471b-ae1a-e6a512c47d26 could not be found. {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 914.608394] env[67144]: DEBUG nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Instance disappeared during build. {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 914.608394] env[67144]: DEBUG nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 914.608394] env[67144]: DEBUG nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 914.608394] env[67144]: DEBUG nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 914.608605] env[67144]: DEBUG nova.network.neutron [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 914.738688] env[67144]: DEBUG nova.network.neutron [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 914.754672] env[67144]: INFO nova.compute.manager [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] [instance: 6cbf4358-dcfa-471b-ae1a-e6a512c47d26] Took 0.15 seconds to deallocate network for instance. 
[ 914.760688] env[67144]: DEBUG oslo_vmware.rw_handles [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/44bbe015-49fa-4f62-95b7-10f37767519e/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 914.868831] env[67144]: DEBUG oslo_vmware.rw_handles [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Completed reading data from the image iterator. {{(pid=67144) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 914.869128] env[67144]: DEBUG oslo_vmware.rw_handles [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/44bbe015-49fa-4f62-95b7-10f37767519e/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67144) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 914.961464] env[67144]: DEBUG oslo_concurrency.lockutils [None req-71dba420-0460-4dc8-a31c-a3b3a23278c0 tempest-ServersAdminNegativeTestJSON-1765808988 tempest-ServersAdminNegativeTestJSON-1765808988-project-member] Lock "6cbf4358-dcfa-471b-ae1a-e6a512c47d26" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 296.711s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 914.973790] env[67144]: DEBUG nova.compute.manager [None req-f19d2ac8-ce01-4774-90b8-44cf7886f473 tempest-SecurityGroupsTestJSON-806931160 tempest-SecurityGroupsTestJSON-806931160-project-member] [instance: eebe36ea-6a07-4806-bade-4222dcf24247] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 915.003044] env[67144]: DEBUG nova.compute.manager [None req-f19d2ac8-ce01-4774-90b8-44cf7886f473 tempest-SecurityGroupsTestJSON-806931160 tempest-SecurityGroupsTestJSON-806931160-project-member] [instance: eebe36ea-6a07-4806-bade-4222dcf24247] Instance disappeared before build. 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 915.026484] env[67144]: DEBUG oslo_concurrency.lockutils [None req-f19d2ac8-ce01-4774-90b8-44cf7886f473 tempest-SecurityGroupsTestJSON-806931160 tempest-SecurityGroupsTestJSON-806931160-project-member] Lock "eebe36ea-6a07-4806-bade-4222dcf24247" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.509s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 915.037226] env[67144]: DEBUG nova.compute.manager [None req-34da9e61-6b9f-4fea-b7c9-4c1ab530d84d tempest-VolumesAdminNegativeTest-109429009 tempest-VolumesAdminNegativeTest-109429009-project-member] [instance: 3a37ecb3-0196-4230-adea-ed14355ece08] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 915.062885] env[67144]: DEBUG nova.compute.manager [None req-34da9e61-6b9f-4fea-b7c9-4c1ab530d84d tempest-VolumesAdminNegativeTest-109429009 tempest-VolumesAdminNegativeTest-109429009-project-member] [instance: 3a37ecb3-0196-4230-adea-ed14355ece08] Instance disappeared before build. 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 915.086593] env[67144]: DEBUG oslo_concurrency.lockutils [None req-34da9e61-6b9f-4fea-b7c9-4c1ab530d84d tempest-VolumesAdminNegativeTest-109429009 tempest-VolumesAdminNegativeTest-109429009-project-member] Lock "3a37ecb3-0196-4230-adea-ed14355ece08" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.105s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 915.096108] env[67144]: DEBUG nova.compute.manager [None req-d704f149-ad26-4304-bd93-53bc9d920373 tempest-ServersTestJSON-1313036657 tempest-ServersTestJSON-1313036657-project-member] [instance: abda0de6-f344-4dd1-b439-42826b59de5a] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 915.122674] env[67144]: DEBUG nova.compute.manager [None req-d704f149-ad26-4304-bd93-53bc9d920373 tempest-ServersTestJSON-1313036657 tempest-ServersTestJSON-1313036657-project-member] [instance: abda0de6-f344-4dd1-b439-42826b59de5a] Instance disappeared before build. {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 915.144090] env[67144]: DEBUG oslo_concurrency.lockutils [None req-d704f149-ad26-4304-bd93-53bc9d920373 tempest-ServersTestJSON-1313036657 tempest-ServersTestJSON-1313036657-project-member] Lock "abda0de6-f344-4dd1-b439-42826b59de5a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 196.853s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 915.158870] env[67144]: DEBUG nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Starting instance... 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 915.209394] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 915.209622] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 915.211097] env[67144]: INFO nova.compute.claims [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 915.283392] env[67144]: DEBUG nova.scheduler.client.report [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Refreshing inventories for resource provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 915.301104] env[67144]: DEBUG nova.scheduler.client.report [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Updating ProviderTree inventory for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 915.301104] env[67144]: DEBUG nova.compute.provider_tree [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Updating inventory in ProviderTree for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 915.316283] env[67144]: DEBUG nova.scheduler.client.report [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Refreshing aggregate associations for resource provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8, aggregates: None {{(pid=67144) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 915.339404] env[67144]: DEBUG nova.scheduler.client.report [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Refreshing trait associations for resource provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO {{(pid=67144) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 915.493547] 
env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64dafab6-c081-4bad-8da6-f6225eafcfa1 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.501935] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aa36b00-7343-45c0-9c5a-c7a7a0dc72f4 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.536348] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bfec379-47c3-42d1-9608-d5cabd85a525 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.543788] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d53e31a7-8f85-410f-af17-31c7c0743b48 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.557253] env[67144]: DEBUG nova.compute.provider_tree [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 915.569077] env[67144]: DEBUG nova.scheduler.client.report [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 
'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 915.590201] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.380s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 915.590859] env[67144]: DEBUG nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 915.632374] env[67144]: DEBUG nova.compute.utils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 915.633762] env[67144]: DEBUG nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Not allocating networking since 'none' was specified. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 915.644420] env[67144]: DEBUG nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Start building block device mappings for instance. 
{{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 915.720894] env[67144]: DEBUG nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Start spawning the instance on the hypervisor. {{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 915.743193] env[67144]: DEBUG nova.virt.hardware [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 915.743427] env[67144]: DEBUG nova.virt.hardware [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 915.743584] env[67144]: DEBUG nova.virt.hardware [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 
tempest-ServerShowV247Test-925015620-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 915.743767] env[67144]: DEBUG nova.virt.hardware [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 915.743910] env[67144]: DEBUG nova.virt.hardware [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 915.744074] env[67144]: DEBUG nova.virt.hardware [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 915.744285] env[67144]: DEBUG nova.virt.hardware [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 915.744442] env[67144]: DEBUG nova.virt.hardware [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 915.744617] env[67144]: DEBUG nova.virt.hardware [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 
tempest-ServerShowV247Test-925015620-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 915.744796] env[67144]: DEBUG nova.virt.hardware [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 915.744915] env[67144]: DEBUG nova.virt.hardware [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 915.745826] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dbc4ee1-5bbe-48fa-826d-85a9d96f9a8e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.753841] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c233628-de01-4b43-9938-781f79cb476a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.768163] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Instance VIF info [] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 915.773826] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Creating folder: Project (6af85a6704da4986810d4d07cea1ac1e). 
Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 915.774115] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a829ca49-51b5-4d16-b02b-604a637e665b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.784940] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Created folder: Project (6af85a6704da4986810d4d07cea1ac1e) in parent group-v572613. [ 915.785028] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Creating folder: Instances. Parent ref: group-v572669. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 915.785374] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d3b1cbcb-aea8-4b12-89c2-82b201e7352f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.794402] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Created folder: Instances in parent group-v572669. [ 915.794637] env[67144]: DEBUG oslo.service.loopingcall [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 915.794821] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 915.795017] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-eace1a06-6c54-4f72-84ce-cc124f392dc7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.810897] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 915.810897] env[67144]: value = "task-2848086" [ 915.810897] env[67144]: _type = "Task" [ 915.810897] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 915.818209] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848086, 'name': CreateVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 916.327100] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848086, 'name': CreateVM_Task, 'duration_secs': 0.25119} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 916.327100] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 916.327100] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 916.327100] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 916.327100] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 916.327100] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-79f71f13-e8f7-4391-b6be-f90454dc053c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 916.332378] env[67144]: DEBUG oslo_vmware.api [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Waiting for the task: (returnval){ [ 916.332378] env[67144]: value = 
"session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52c323d1-24a9-83af-7e03-bd962aeab457" [ 916.332378] env[67144]: _type = "Task" [ 916.332378] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 916.340250] env[67144]: DEBUG oslo_vmware.api [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52c323d1-24a9-83af-7e03-bd962aeab457, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 916.853220] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 916.853220] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 916.853220] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 932.415945] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running 
periodic task ComputeManager._instance_usage_audit {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 932.416275] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 932.427179] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 932.427391] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 932.427559] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 932.427716] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67144) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 932.428759] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05a80030-556f-43ad-9842-815943c53416 {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 932.437649] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a784a714-32d3-4e1e-9a16-569597dedd95 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 932.451246] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e256ba7d-c46d-42d8-aa7b-de3f426160fb {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 932.457637] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94ea2b3e-6c0e-4d9e-bdd1-d02a5a35420e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 932.486634] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181065MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=67144) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 932.486860] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 932.487059] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 932.532023] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 932.532023] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance ca7b7941-c016-4968-9beb-f8c094ca16cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 932.544682] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 670f3974-b332-48c2-9aab-6a9ed01731b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 932.559490] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 932.572089] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance e32c24e1-485d-48b9-827b-fceb6828510c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 932.572542] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 932.572670] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 932.653024] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-733f9d60-cd71-4abe-8756-1adac01f0d6f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 932.661015] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17f9a027-43f0-42a3-b0b0-5d73ea719d0a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 932.692084] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3f4abf2-b2fc-4434-b7e0-7491039ab2c1 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 932.699544] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-411eeed4-c4a4-487f-b8da-013173b1ff60 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 932.713182] env[67144]: DEBUG nova.compute.provider_tree [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 932.723354] env[67144]: DEBUG nova.scheduler.client.report [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 932.737906] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67144) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 932.738133] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.251s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 933.739213] env[67144]: DEBUG oslo_service.periodic_task [None
req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 935.416649] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 935.416897] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Starting heal instance info cache {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 935.416931] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Rebuilding the list of instances to heal {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 935.429356] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 935.429356] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 935.429467] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Didn't find any instances for network info cache update. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 935.429913] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 935.430123] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 935.430289] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 935.431293] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67144) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 936.425537] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 937.416762] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 938.414565] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 940.143087] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Acquiring lock "842426aa-72a3-4604-b50b-9705b55ea396" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 940.143087] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Lock "842426aa-72a3-4604-b50b-9705b55ea396" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 964.856488] env[67144]: WARNING oslo_vmware.rw_handles [None
req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 964.856488] env[67144]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 964.856488] env[67144]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 964.856488] env[67144]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 964.856488] env[67144]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 964.856488] env[67144]: ERROR oslo_vmware.rw_handles     response.begin()
[ 964.856488] env[67144]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 964.856488] env[67144]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 964.856488] env[67144]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 964.856488] env[67144]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 964.856488] env[67144]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 964.856488] env[67144]: ERROR oslo_vmware.rw_handles
[ 964.857293] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Downloaded image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to vmware_temp/44bbe015-49fa-4f62-95b7-10f37767519e/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 964.858940] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Caching image {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 964.858940] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Copying Virtual Disk [datastore1] vmware_temp/44bbe015-49fa-4f62-95b7-10f37767519e/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk to [datastore1] vmware_temp/44bbe015-49fa-4f62-95b7-10f37767519e/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk {{(pid=67144) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 964.859265] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c7444e7b-f74e-4acd-8f55-54ec597856b4 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.868738] env[67144]: DEBUG oslo_vmware.api [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Waiting for the task: (returnval){
[ 964.868738] env[67144]: value = "task-2848087"
[ 964.868738] env[67144]: _type = "Task"
[ 964.868738] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 964.877152] env[67144]: DEBUG oslo_vmware.api [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Task: {'id': task-2848087, 'name': CopyVirtualDisk_Task} progress is 0%.
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 965.379211] env[67144]: DEBUG oslo_vmware.exceptions [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Fault InvalidArgument not matched. {{(pid=67144) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 965.379486] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 965.379993] env[67144]: ERROR nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 965.379993] env[67144]: Faults: ['InvalidArgument']
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Traceback (most recent call last):
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     yield resources
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     self.driver.spawn(context, instance, image_meta,
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     self._fetch_image_if_missing(context, vi)
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     image_cache(vi, tmp_image_ds_loc)
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     vm_util.copy_virtual_disk(
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     session._wait_for_task(vmdk_copy_task)
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     return self.wait_for_task(task_ref)
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     return evt.wait()
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     result = hub.switch()
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     return self.greenlet.switch()
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     self.f(*self.args, **self.kw)
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]     raise exceptions.translate_fault(task_info.error)
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Faults: ['InvalidArgument']
[ 965.379993] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6]
[ 965.380846] env[67144]: INFO nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Terminating instance
[ 965.381859] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 965.382078] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 965.382307] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e66eb111-0243-4a4d-af17-df74b1c99035 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.384436] env[67144]: DEBUG nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Start destroying the instance on the hypervisor.
{{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 965.384627] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 965.385322] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-937f8161-4943-4b15-9c0c-af4c8fea1097 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.391696] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 965.391890] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-68b4e39e-57fe-4653-be70-09cae11a2918 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.393921] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 965.394086] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 965.394959] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7ab74b10-5f2c-409e-ab6a-286a27357bb0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.399229] env[67144]: DEBUG oslo_vmware.api [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Waiting for the task: (returnval){
[ 965.399229] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52a6cd0c-5a03-fbd3-429f-436a4f8c886c"
[ 965.399229] env[67144]: _type = "Task"
[ 965.399229] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 965.406226] env[67144]: DEBUG oslo_vmware.api [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52a6cd0c-5a03-fbd3-429f-436a4f8c886c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 965.461194] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 965.461327] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 965.461508] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Deleting the datastore file [datastore1] 5bb4c082-f5fc-42e6-891a-4866eef1add6 {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 965.461764] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-39d61ff7-d9a7-438d-a496-194a05955a7f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.468155] env[67144]: DEBUG oslo_vmware.api [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Waiting for the task: (returnval){
[ 965.468155] env[67144]: value = "task-2848089"
[ 965.468155] env[67144]: _type = "Task"
[ 965.468155] env[67144]: } to complete.
{{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 965.475595] env[67144]: DEBUG oslo_vmware.api [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Task: {'id': task-2848089, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 965.909607] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 965.910026] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Creating directory with path [datastore1] vmware_temp/48ff80ec-98fb-4c5d-b8c6-9231c9c735ef/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 965.910110] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c9ebc151-c764-4da2-ad2b-b1a66e53f44e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.936248] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Created directory with path [datastore1] vmware_temp/48ff80ec-98fb-4c5d-b8c6-9231c9c735ef/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 965.936438] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Fetch image to [datastore1] vmware_temp/48ff80ec-98fb-4c5d-b8c6-9231c9c735ef/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 965.936622] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/48ff80ec-98fb-4c5d-b8c6-9231c9c735ef/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 965.937353] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-581e2309-f964-4029-ada5-b0bed19782fa {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.954101] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1ef98c5-dcda-4b0d-8de5-630c913c1efe {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.966548] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29fc92d2-c5af-4bd3-a395-e3ed8a80d48b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.979281] env[67144]: DEBUG oslo_vmware.api [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Task: {'id': task-2848089, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080708} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 966.004330] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 966.004632] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 966.004766] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 966.004927] env[67144]: INFO nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Took 0.62 seconds to destroy the instance on the hypervisor.
[ 966.007265] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e62d274c-1de5-4107-b78e-fe62b98a9727 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 966.010063] env[67144]: DEBUG nova.compute.claims [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 966.010238] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 966.010456] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 966.016302] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-25c0fa89-5fbf-4f91-a391-fd3812a8d085 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 966.034787] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 966.035538] env[67144]: DEBUG nova.compute.utils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Instance 5bb4c082-f5fc-42e6-891a-4866eef1add6 could not be found. {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 966.038416] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 966.041045] env[67144]: DEBUG nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Instance disappeared during build. {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 966.041145] env[67144]: DEBUG nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 966.041333] env[67144]: DEBUG nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 966.041629] env[67144]: DEBUG nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 966.041685] env[67144]: DEBUG nova.network.neutron [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 966.085447] env[67144]: DEBUG oslo_vmware.rw_handles [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Creating HTTP connection to write to file with size = 21318656 and URL =
https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/48ff80ec-98fb-4c5d-b8c6-9231c9c735ef/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 966.143318] env[67144]: DEBUG oslo_vmware.rw_handles [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Completed reading data from the image iterator. {{(pid=67144) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 966.143575] env[67144]: DEBUG oslo_vmware.rw_handles [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/48ff80ec-98fb-4c5d-b8c6-9231c9c735ef/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 966.254682] env[67144]: DEBUG neutronclient.v2_0.client [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67144) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 966.258386] env[67144]: ERROR nova.compute.manager [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Traceback (most recent call last): [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self.driver.spawn(context, instance, image_meta, [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self._fetch_image_if_missing(context, vi) [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] image_cache(vi, tmp_image_ds_loc) [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] vm_util.copy_virtual_disk( [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 966.258386] env[67144]: 
ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] session._wait_for_task(vmdk_copy_task) [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return self.wait_for_task(task_ref) [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return evt.wait() [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] result = hub.switch() [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return self.greenlet.switch() [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self.f(*self.args, **self.kw) [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 
5bb4c082-f5fc-42e6-891a-4866eef1add6] raise exceptions.translate_fault(task_info.error) [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Faults: ['InvalidArgument'] [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] During handling of the above exception, another exception occurred: [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Traceback (most recent call last): [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self._build_and_run_instance(context, instance, image, [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] with excutils.save_and_reraise_exception(): [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self.force_reraise() [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 
5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 966.258386] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] raise self.value [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] with self.rt.instance_claim(context, instance, node, allocs, [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self.abort() [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return f(*args, **kwargs) [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self._unset_instance_host_and_node(instance) [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] 
File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] instance.save() [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] updates, result = self.indirection_api.object_action( [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return cctxt.call(context, 'object_action', objinst=objinst, [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] result = self.transport._send( [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return self._driver.send(target, ctxt, message, [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 966.259558] env[67144]: ERROR 
nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] raise result [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] nova.exception_Remote.InstanceNotFound_Remote: Instance 5bb4c082-f5fc-42e6-891a-4866eef1add6 could not be found. [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Traceback (most recent call last): [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return getattr(target, method)(*args, **kwargs) [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return fn(self, *args, **kwargs) [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] old_ref, inst_ref = db.instance_update_and_get_original( [ 966.259558] env[67144]: ERROR nova.compute.manager 
[instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return f(*args, **kwargs) [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 966.259558] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] with excutils.save_and_reraise_exception() as ectxt: [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self.force_reraise() [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] raise self.value [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 
5bb4c082-f5fc-42e6-891a-4866eef1add6] return f(*args, **kwargs) [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return f(context, *args, **kwargs) [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] raise exception.InstanceNotFound(instance_id=uuid) [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] nova.exception.InstanceNotFound: Instance 5bb4c082-f5fc-42e6-891a-4866eef1add6 could not be found. 
[ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] During handling of the above exception, another exception occurred: [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Traceback (most recent call last): [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] ret = obj(*args, **kwargs) [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] exception_handler_v20(status_code, error_body) [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] raise client_exc(message=error_message, [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 
5bb4c082-f5fc-42e6-891a-4866eef1add6] Neutron server returns request_ids: ['req-8ee7d106-263f-4d8d-862d-2035de9c3218'] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] During handling of the above exception, another exception occurred: [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] Traceback (most recent call last): [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self._deallocate_network(context, instance, requested_networks) [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self.network_api.deallocate_for_instance( [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] data = neutron.list_ports(**search_opts) [ 966.260927] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] ret = obj(*args, **kwargs) [ 966.261831] env[67144]: ERROR nova.compute.manager 
[instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return self.list('ports', self.ports_path, retrieve_all, [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] ret = obj(*args, **kwargs) [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] for r in self._pagination(collection, path, **params): [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] res = self.get(path, params=params) [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] ret = obj(*args, **kwargs) [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return self.retry_request("GET", action, body=body, [ 966.261831] env[67144]: ERROR 
nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] ret = obj(*args, **kwargs) [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] return self.do_request(method, action, body=body, [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] ret = obj(*args, **kwargs) [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] self._handle_fault_response(status_code, replybody, resp) [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] raise exception.Unauthorized() [ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] nova.exception.Unauthorized: Not authorized. 
[ 966.261831] env[67144]: ERROR nova.compute.manager [instance: 5bb4c082-f5fc-42e6-891a-4866eef1add6] [ 966.281096] env[67144]: DEBUG oslo_concurrency.lockutils [None req-62039a75-8010-4d37-8521-1d40ec605817 tempest-ImagesOneServerTestJSON-905412758 tempest-ImagesOneServerTestJSON-905412758-project-member] Lock "5bb4c082-f5fc-42e6-891a-4866eef1add6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 346.227s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 966.290410] env[67144]: DEBUG nova.compute.manager [None req-144b1a49-1d95-4d3d-86a1-ef360ebc0355 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: 56ba6c8d-1717-4d07-b547-7872f985b0f3] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 966.316063] env[67144]: DEBUG nova.compute.manager [None req-144b1a49-1d95-4d3d-86a1-ef360ebc0355 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: 56ba6c8d-1717-4d07-b547-7872f985b0f3] Instance disappeared before build. 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 966.335622] env[67144]: DEBUG oslo_concurrency.lockutils [None req-144b1a49-1d95-4d3d-86a1-ef360ebc0355 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Lock "56ba6c8d-1717-4d07-b547-7872f985b0f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 243.838s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 966.343713] env[67144]: DEBUG nova.compute.manager [None req-b3c7b758-fbcf-4844-9177-f4e7f25caab5 tempest-AttachVolumeTestJSON-1172703336 tempest-AttachVolumeTestJSON-1172703336-project-member] [instance: 41193ca9-3f5f-43a2-9335-1010b1f752a1] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 966.365643] env[67144]: DEBUG nova.compute.manager [None req-b3c7b758-fbcf-4844-9177-f4e7f25caab5 tempest-AttachVolumeTestJSON-1172703336 tempest-AttachVolumeTestJSON-1172703336-project-member] [instance: 41193ca9-3f5f-43a2-9335-1010b1f752a1] Instance disappeared before build. 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 966.384566] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b3c7b758-fbcf-4844-9177-f4e7f25caab5 tempest-AttachVolumeTestJSON-1172703336 tempest-AttachVolumeTestJSON-1172703336-project-member] Lock "41193ca9-3f5f-43a2-9335-1010b1f752a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 241.851s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 966.392105] env[67144]: DEBUG nova.compute.manager [None req-06269f76-02bd-4854-a336-baa7f50f48fb tempest-ServerRescueTestJSON-2036457023 tempest-ServerRescueTestJSON-2036457023-project-member] [instance: e64bc93e-f99f-4f9e-a41e-283d405b1b92] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 966.412198] env[67144]: DEBUG nova.compute.manager [None req-06269f76-02bd-4854-a336-baa7f50f48fb tempest-ServerRescueTestJSON-2036457023 tempest-ServerRescueTestJSON-2036457023-project-member] [instance: e64bc93e-f99f-4f9e-a41e-283d405b1b92] Instance disappeared before build. 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 966.432520] env[67144]: DEBUG oslo_concurrency.lockutils [None req-06269f76-02bd-4854-a336-baa7f50f48fb tempest-ServerRescueTestJSON-2036457023 tempest-ServerRescueTestJSON-2036457023-project-member] Lock "e64bc93e-f99f-4f9e-a41e-283d405b1b92" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.111s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 966.443049] env[67144]: DEBUG nova.compute.manager [None req-4507ce0d-70e8-4105-ab23-3a7cd8cfb758 tempest-ServerDiagnosticsV248Test-2018146697 tempest-ServerDiagnosticsV248Test-2018146697-project-member] [instance: 07259d91-ca24-4e5e-8340-d72f3b8e2776] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 966.464330] env[67144]: DEBUG nova.compute.manager [None req-4507ce0d-70e8-4105-ab23-3a7cd8cfb758 tempest-ServerDiagnosticsV248Test-2018146697 tempest-ServerDiagnosticsV248Test-2018146697-project-member] [instance: 07259d91-ca24-4e5e-8340-d72f3b8e2776] Instance disappeared before build. 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 966.483876] env[67144]: DEBUG oslo_concurrency.lockutils [None req-4507ce0d-70e8-4105-ab23-3a7cd8cfb758 tempest-ServerDiagnosticsV248Test-2018146697 tempest-ServerDiagnosticsV248Test-2018146697-project-member] Lock "07259d91-ca24-4e5e-8340-d72f3b8e2776" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.133s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 966.491924] env[67144]: DEBUG nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 966.538445] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 966.538695] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 966.540131] env[67144]: INFO nova.compute.claims [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 
670f3974-b332-48c2-9aab-6a9ed01731b7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 966.655495] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50d655c8-e536-485b-966e-22b21ee9864c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 966.664268] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-205daa75-4bce-4e9b-beed-9abc5e98b4f2 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 966.693020] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dccac79-867d-443d-853e-d5531c8ec267 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 966.699537] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5eeda59-8fdb-4a0d-8e14-eb54469c0f3f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 966.712066] env[67144]: DEBUG nova.compute.provider_tree [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 966.720240] env[67144]: DEBUG nova.scheduler.client.report [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 
196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 966.732052] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.193s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 966.732494] env[67144]: DEBUG nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Start building networks asynchronously for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 966.760823] env[67144]: DEBUG nova.compute.utils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 966.761879] env[67144]: DEBUG nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Allocating IP information in the background. 
{{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 966.762083] env[67144]: DEBUG nova.network.neutron [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 966.771849] env[67144]: DEBUG nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 966.828146] env[67144]: DEBUG nova.policy [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bca0226d36648fc8bc370f16b62f1a4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '38e6d6ab2a79447bb038b72c6787028f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 966.831086] env[67144]: DEBUG nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Start spawning the instance on the hypervisor. 
{{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 966.850800] env[67144]: DEBUG nova.virt.hardware [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 966.851024] env[67144]: DEBUG nova.virt.hardware [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 966.851189] env[67144]: DEBUG nova.virt.hardware [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 966.851369] env[67144]: DEBUG nova.virt.hardware [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 
tempest-DeleteServersAdminTestJSON-1141988955-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 966.851513] env[67144]: DEBUG nova.virt.hardware [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 966.851661] env[67144]: DEBUG nova.virt.hardware [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 966.851860] env[67144]: DEBUG nova.virt.hardware [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 966.852027] env[67144]: DEBUG nova.virt.hardware [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 966.852228] env[67144]: DEBUG nova.virt.hardware [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 966.852408] env[67144]: DEBUG nova.virt.hardware [None 
req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 966.852582] env[67144]: DEBUG nova.virt.hardware [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 966.853632] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50ccab73-35b7-4a01-bea0-b860bfd92ce1 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 966.861039] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea1753d8-4587-471b-9ef5-e2d8c67b70cf {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 967.103462] env[67144]: DEBUG nova.network.neutron [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Successfully created port: c1570a75-1610-4f3f-bec4-98143f091678 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 967.610336] env[67144]: DEBUG nova.compute.manager [req-380f9c82-b4fc-4fc2-85cd-b19ce831af43 req-05a765bb-4c0d-414b-a1ec-edf6a2c965cf service nova] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Received event network-vif-plugged-c1570a75-1610-4f3f-bec4-98143f091678 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 967.610533] env[67144]: DEBUG 
oslo_concurrency.lockutils [req-380f9c82-b4fc-4fc2-85cd-b19ce831af43 req-05a765bb-4c0d-414b-a1ec-edf6a2c965cf service nova] Acquiring lock "670f3974-b332-48c2-9aab-6a9ed01731b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 967.610726] env[67144]: DEBUG oslo_concurrency.lockutils [req-380f9c82-b4fc-4fc2-85cd-b19ce831af43 req-05a765bb-4c0d-414b-a1ec-edf6a2c965cf service nova] Lock "670f3974-b332-48c2-9aab-6a9ed01731b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 967.610920] env[67144]: DEBUG oslo_concurrency.lockutils [req-380f9c82-b4fc-4fc2-85cd-b19ce831af43 req-05a765bb-4c0d-414b-a1ec-edf6a2c965cf service nova] Lock "670f3974-b332-48c2-9aab-6a9ed01731b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 967.611079] env[67144]: DEBUG nova.compute.manager [req-380f9c82-b4fc-4fc2-85cd-b19ce831af43 req-05a765bb-4c0d-414b-a1ec-edf6a2c965cf service nova] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] No waiting events found dispatching network-vif-plugged-c1570a75-1610-4f3f-bec4-98143f091678 {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 967.611243] env[67144]: WARNING nova.compute.manager [req-380f9c82-b4fc-4fc2-85cd-b19ce831af43 req-05a765bb-4c0d-414b-a1ec-edf6a2c965cf service nova] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Received unexpected event network-vif-plugged-c1570a75-1610-4f3f-bec4-98143f091678 for instance with vm_state building and task_state spawning. 
[ 967.651315] env[67144]: DEBUG nova.network.neutron [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Successfully updated port: c1570a75-1610-4f3f-bec4-98143f091678 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 967.662196] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "refresh_cache-670f3974-b332-48c2-9aab-6a9ed01731b7" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 967.662345] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquired lock "refresh_cache-670f3974-b332-48c2-9aab-6a9ed01731b7" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 967.662491] env[67144]: DEBUG nova.network.neutron [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 967.697327] env[67144]: DEBUG nova.network.neutron [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Instance cache missing network info. 
{{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 967.842402] env[67144]: DEBUG nova.network.neutron [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Updating instance_info_cache with network_info: [{"id": "c1570a75-1610-4f3f-bec4-98143f091678", "address": "fa:16:3e:14:86:78", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1570a75-16", "ovs_interfaceid": "c1570a75-1610-4f3f-bec4-98143f091678", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 967.854784] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Releasing lock "refresh_cache-670f3974-b332-48c2-9aab-6a9ed01731b7" {{(pid=67144) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 967.855079] env[67144]: DEBUG nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Instance network_info: |[{"id": "c1570a75-1610-4f3f-bec4-98143f091678", "address": "fa:16:3e:14:86:78", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1570a75-16", "ovs_interfaceid": "c1570a75-1610-4f3f-bec4-98143f091678", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 967.855448] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:14:86:78', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 
'27abaf31-0f39-428c-a8d3-cd7548de6818', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c1570a75-1610-4f3f-bec4-98143f091678', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 967.863183] env[67144]: DEBUG oslo.service.loopingcall [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 967.863612] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 967.863827] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-14acf129-5fa2-40c6-bb2c-5789c6202a4a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 967.884084] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 967.884084] env[67144]: value = "task-2848090" [ 967.884084] env[67144]: _type = "Task" [ 967.884084] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 967.891549] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848090, 'name': CreateVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 968.394573] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848090, 'name': CreateVM_Task, 'duration_secs': 0.294123} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 968.394858] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 968.395758] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 968.395758] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 968.395874] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 968.396051] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4cd77f74-6f8a-4920-8796-728d59a9ed1c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 968.400330] env[67144]: DEBUG oslo_vmware.api [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] 
Waiting for the task: (returnval){ [ 968.400330] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]529938a5-be66-c0ef-fe04-564cabb875ab" [ 968.400330] env[67144]: _type = "Task" [ 968.400330] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 968.407526] env[67144]: DEBUG oslo_vmware.api [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]529938a5-be66-c0ef-fe04-564cabb875ab, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 968.911024] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 968.911299] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 968.911512] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 969.634743] env[67144]: DEBUG nova.compute.manager [req-e8c1bd88-d91a-4598-acaa-199215bbc2fb req-828385f5-4c5d-4702-8886-084a283fdf76 service nova] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Received event network-changed-c1570a75-1610-4f3f-bec4-98143f091678 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 969.635018] env[67144]: DEBUG nova.compute.manager [req-e8c1bd88-d91a-4598-acaa-199215bbc2fb req-828385f5-4c5d-4702-8886-084a283fdf76 service nova] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Refreshing instance network info cache due to event network-changed-c1570a75-1610-4f3f-bec4-98143f091678. {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 969.635173] env[67144]: DEBUG oslo_concurrency.lockutils [req-e8c1bd88-d91a-4598-acaa-199215bbc2fb req-828385f5-4c5d-4702-8886-084a283fdf76 service nova] Acquiring lock "refresh_cache-670f3974-b332-48c2-9aab-6a9ed01731b7" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 969.635382] env[67144]: DEBUG oslo_concurrency.lockutils [req-e8c1bd88-d91a-4598-acaa-199215bbc2fb req-828385f5-4c5d-4702-8886-084a283fdf76 service nova] Acquired lock "refresh_cache-670f3974-b332-48c2-9aab-6a9ed01731b7" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 969.635483] env[67144]: DEBUG nova.network.neutron [req-e8c1bd88-d91a-4598-acaa-199215bbc2fb req-828385f5-4c5d-4702-8886-084a283fdf76 service nova] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Refreshing network info cache for port c1570a75-1610-4f3f-bec4-98143f091678 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 969.864165] env[67144]: DEBUG nova.network.neutron [req-e8c1bd88-d91a-4598-acaa-199215bbc2fb req-828385f5-4c5d-4702-8886-084a283fdf76 service nova] [instance: 
670f3974-b332-48c2-9aab-6a9ed01731b7] Updated VIF entry in instance network info cache for port c1570a75-1610-4f3f-bec4-98143f091678. {{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 969.864559] env[67144]: DEBUG nova.network.neutron [req-e8c1bd88-d91a-4598-acaa-199215bbc2fb req-828385f5-4c5d-4702-8886-084a283fdf76 service nova] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Updating instance_info_cache with network_info: [{"id": "c1570a75-1610-4f3f-bec4-98143f091678", "address": "fa:16:3e:14:86:78", "network": {"id": "55519089-83bf-4e48-a664-06123d7d91f1", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "d66969059be64e7d86646b564fe28c7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "27abaf31-0f39-428c-a8d3-cd7548de6818", "external-id": "nsx-vlan-transportzone-505", "segmentation_id": 505, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1570a75-16", "ovs_interfaceid": "c1570a75-1610-4f3f-bec4-98143f091678", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 969.873575] env[67144]: DEBUG oslo_concurrency.lockutils [req-e8c1bd88-d91a-4598-acaa-199215bbc2fb req-828385f5-4c5d-4702-8886-084a283fdf76 service nova] Releasing lock "refresh_cache-670f3974-b332-48c2-9aab-6a9ed01731b7" {{(pid=67144) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 993.416473] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 993.426466] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 993.426680] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 993.426849] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 993.427015] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67144) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 993.428158] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-519c6c98-34e2-4e9d-a518-7faa2bbaee12 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 993.438596] env[67144]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e85fb869-4c0c-4f58-baf0-472384b93a4a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 993.453082] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75c35a74-3e99-47b0-9528-9c04ccaffd63 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 993.460690] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c07ab34-15a2-44e1-8fed-216e6d903cfc {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 993.490486] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180899MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=67144) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 993.490559] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 993.490809] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 993.536395] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9 actively managed on this compute host 
and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 993.536562] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance ca7b7941-c016-4968-9beb-f8c094ca16cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 993.536695] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 670f3974-b332-48c2-9aab-6a9ed01731b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 993.548608] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 993.560247] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance e32c24e1-485d-48b9-827b-fceb6828510c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 993.571017] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 842426aa-72a3-4604-b50b-9705b55ea396 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 993.571276] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 993.571464] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 993.654693] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3cb3124-b6e3-40a1-9528-0031e492afde {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 993.662748] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab59f2ad-b56e-47c0-8a34-0601707f04e4 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 993.693487] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4add705-327b-4913-b433-3b19fce089a0 {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 993.700975] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2344af56-6fd3-444b-8a6c-f073cb3809b8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 993.714783] env[67144]: DEBUG nova.compute.provider_tree [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 993.722893] env[67144]: DEBUG nova.scheduler.client.report [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 993.736992] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67144) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 993.737310] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 994.738316] env[67144]: DEBUG oslo_service.periodic_task [None 
req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 994.738659] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 996.416656] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 996.417088] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Starting heal instance info cache {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 996.417088] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Rebuilding the list of instances to heal {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 996.429511] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 996.429682] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Skipping network cache update for instance because it is Building. 
{{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 996.429791] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 996.429918] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Didn't find any instances for network info cache update. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 996.430342] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 996.430516] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 996.430666] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 996.430798] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67144) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 997.417306] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 998.411567] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1014.876675] env[67144]: WARNING oslo_vmware.rw_handles [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1014.876675] env[67144]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1014.876675] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1014.876675] env[67144]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1014.876675] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1014.876675] env[67144]: ERROR oslo_vmware.rw_handles response.begin() [ 1014.876675] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1014.876675] env[67144]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1014.876675] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1014.876675] env[67144]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1014.876675] 
env[67144]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1014.876675] env[67144]: ERROR oslo_vmware.rw_handles [ 1014.877456] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Downloaded image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to vmware_temp/48ff80ec-98fb-4c5d-b8c6-9231c9c735ef/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1014.878841] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Caching image {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1014.879095] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Copying Virtual Disk [datastore1] vmware_temp/48ff80ec-98fb-4c5d-b8c6-9231c9c735ef/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk to [datastore1] vmware_temp/48ff80ec-98fb-4c5d-b8c6-9231c9c735ef/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk {{(pid=67144) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1014.879370] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-88a45a68-61cc-412e-adea-99bd43781dbe {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1014.886611] env[67144]: DEBUG oslo_vmware.api [None req-682556ee-d737-4954-8d2c-995e437a97ab 
tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Waiting for the task: (returnval){ [ 1014.886611] env[67144]: value = "task-2848091" [ 1014.886611] env[67144]: _type = "Task" [ 1014.886611] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1014.894114] env[67144]: DEBUG oslo_vmware.api [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Task: {'id': task-2848091, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1015.397183] env[67144]: DEBUG oslo_vmware.exceptions [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Fault InvalidArgument not matched. {{(pid=67144) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1015.397425] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1015.397963] env[67144]: ERROR nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1015.397963] env[67144]: Faults: ['InvalidArgument'] [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: 
b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Traceback (most recent call last): [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] yield resources [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] self.driver.spawn(context, instance, image_meta, [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] self._fetch_image_if_missing(context, vi) [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] image_cache(vi, tmp_image_ds_loc) [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: 
b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] vm_util.copy_virtual_disk( [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] session._wait_for_task(vmdk_copy_task) [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] return self.wait_for_task(task_ref) [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] return evt.wait() [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] result = hub.switch() [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] return self.greenlet.switch() [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] 
self.f(*self.args, **self.kw) [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] raise exceptions.translate_fault(task_info.error) [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Faults: ['InvalidArgument'] [ 1015.397963] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] [ 1015.398756] env[67144]: INFO nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Terminating instance [ 1015.399780] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1015.399984] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1015.400226] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-34cc863f-ab56-466b-9d53-93c048669663 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1015.402381] env[67144]: DEBUG nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Start destroying the instance on the hypervisor. {{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1015.402564] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1015.403294] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc01e61b-a534-4e4e-aa93-a6039fd5feb8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1015.409781] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1015.409992] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-89196f3b-8f38-4dd6-b4a9-e3fbbea17a1d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1015.412072] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Created directory with path 
[datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1015.412272] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1015.413171] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1628295e-6522-42a0-a543-3b68d2fba14f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1015.418088] env[67144]: DEBUG oslo_vmware.api [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Waiting for the task: (returnval){ [ 1015.418088] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]527586f6-af20-de46-272c-aaff64a6346c" [ 1015.418088] env[67144]: _type = "Task" [ 1015.418088] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1015.426108] env[67144]: DEBUG oslo_vmware.api [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]527586f6-af20-de46-272c-aaff64a6346c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1015.479303] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1015.479523] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1015.479703] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Deleting the datastore file [datastore1] b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9 {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1015.479964] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c72f804d-aa04-4b28-9658-b206a4178bbf {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1015.486899] env[67144]: DEBUG oslo_vmware.api [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Waiting for the task: (returnval){ [ 1015.486899] env[67144]: value = "task-2848093" [ 1015.486899] env[67144]: _type = "Task" [ 1015.486899] env[67144]: } to complete. 
{{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1015.495722] env[67144]: DEBUG oslo_vmware.api [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Task: {'id': task-2848093, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1015.928721] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1015.929102] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Creating directory with path [datastore1] vmware_temp/0d3dc579-da86-46ed-ae0d-ae4812e633dc/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1015.929185] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f2bb3955-2c42-4e96-9c19-257d23767b48 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1015.940014] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Created directory with path [datastore1] vmware_temp/0d3dc579-da86-46ed-ae0d-ae4812e633dc/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1015.940208] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None 
req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Fetch image to [datastore1] vmware_temp/0d3dc579-da86-46ed-ae0d-ae4812e633dc/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1015.940394] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/0d3dc579-da86-46ed-ae0d-ae4812e633dc/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1015.943056] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffc67c3a-5675-4e9e-a184-36fb5503af0b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1015.947642] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-871a1913-2bdd-4d7f-8eee-58668051f487 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1015.956946] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34c41a6e-b01e-4612-9784-0ed5051e759c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1015.986630] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-688d2672-2244-4d3d-b7df-e9ccbd27c304 {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1015.997412] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-05cbaee1-3120-4eeb-972f-b18183952ecb {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1015.999057] env[67144]: DEBUG oslo_vmware.api [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Task: {'id': task-2848093, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075654} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1015.999293] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1015.999474] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1015.999646] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1015.999816] env[67144]: INFO nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: 
b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1016.001903] env[67144]: DEBUG nova.compute.claims [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1016.002080] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1016.002295] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1016.020863] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1016.063867] env[67144]: DEBUG oslo_vmware.rw_handles [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0d3dc579-da86-46ed-ae0d-ae4812e633dc/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1016.120660] env[67144]: DEBUG oslo_vmware.rw_handles [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Completed reading data from the image iterator. {{(pid=67144) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1016.120849] env[67144]: DEBUG oslo_vmware.rw_handles [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0d3dc579-da86-46ed-ae0d-ae4812e633dc/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67144) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1016.180193] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7975c7d1-8e5d-43fd-89ab-f0247119315e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1016.187662] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04c35abd-e09d-4ab8-86d7-a3ad9f0156d6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1016.218423] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58be4318-1a76-408d-8dc4-2a9ae5baa3de {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1016.225494] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f84f1569-2e0c-44a9-85d0-90f11f718c6e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1016.238368] env[67144]: DEBUG nova.compute.provider_tree [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1016.246643] env[67144]: DEBUG nova.scheduler.client.report [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1016.262377] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.260s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1016.262894] env[67144]: ERROR nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1016.262894] env[67144]: Faults: ['InvalidArgument'] [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Traceback (most recent call last): [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] self.driver.spawn(context, instance, image_meta, [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1016.262894] env[67144]: ERROR 
nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] self._fetch_image_if_missing(context, vi) [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] image_cache(vi, tmp_image_ds_loc) [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] vm_util.copy_virtual_disk( [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] session._wait_for_task(vmdk_copy_task) [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] return self.wait_for_task(task_ref) [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] return evt.wait() [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: 
b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] result = hub.switch() [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] return self.greenlet.switch() [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] self.f(*self.args, **self.kw) [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] raise exceptions.translate_fault(task_info.error) [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Faults: ['InvalidArgument'] [ 1016.262894] env[67144]: ERROR nova.compute.manager [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] [ 1016.263688] env[67144]: DEBUG nova.compute.utils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] VimFaultException {{(pid=67144) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 1016.265337] env[67144]: DEBUG nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Build of instance b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9 was re-scheduled: A specified parameter was not correct: fileType [ 1016.265337] env[67144]: Faults: ['InvalidArgument'] {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1016.265682] env[67144]: DEBUG nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1016.265849] env[67144]: DEBUG nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1016.266033] env[67144]: DEBUG nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1016.266201] env[67144]: DEBUG nova.network.neutron [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1016.589937] env[67144]: DEBUG nova.network.neutron [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1016.602371] env[67144]: INFO nova.compute.manager [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Took 0.34 seconds to deallocate network for instance. 
[ 1016.728370] env[67144]: INFO nova.scheduler.client.report [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Deleted allocations for instance b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9 [ 1016.746015] env[67144]: DEBUG oslo_concurrency.lockutils [None req-682556ee-d737-4954-8d2c-995e437a97ab tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 393.600s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1016.747064] env[67144]: DEBUG oslo_concurrency.lockutils [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 195.930s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1016.747287] env[67144]: DEBUG oslo_concurrency.lockutils [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Acquiring lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1016.747494] env[67144]: DEBUG oslo_concurrency.lockutils [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1016.747656] env[67144]: DEBUG oslo_concurrency.lockutils [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1016.749820] env[67144]: INFO nova.compute.manager [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Terminating instance [ 1016.751584] env[67144]: DEBUG nova.compute.manager [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Start destroying the instance on the hypervisor. 
{{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1016.751786] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1016.752457] env[67144]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-aed7ef3e-b572-4f4e-80b7-cf288b9456fa {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1016.761292] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8a026d4-ef7b-49a4-985c-a70df12ccf00 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1016.773065] env[67144]: DEBUG nova.compute.manager [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1016.796393] env[67144]: WARNING nova.virt.vmwareapi.vmops [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9 could not be found. 
[ 1016.796393] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1016.796393] env[67144]: INFO nova.compute.manager [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1016.796393] env[67144]: DEBUG oslo.service.loopingcall [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1016.796393] env[67144]: DEBUG nova.compute.manager [-] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1016.796393] env[67144]: DEBUG nova.network.neutron [-] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1016.850296] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1016.850296] env[67144]: DEBUG oslo_concurrency.lockutils 
[None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1016.850296] env[67144]: INFO nova.compute.claims [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1016.858042] env[67144]: DEBUG nova.network.neutron [-] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1016.866506] env[67144]: INFO nova.compute.manager [-] [instance: b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9] Took 0.07 seconds to deallocate network for instance. 
[ 1017.010622] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2f6278a-46ba-4bb6-8ca6-025dc85c93ec {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1017.020596] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aff6617e-3bf9-49dd-81fc-f803857dd7bc {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1017.054172] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0c973ca-19a8-4279-9430-86e6f67efb7a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1017.057792] env[67144]: DEBUG nova.compute.manager [req-fc4f69ea-18cc-4dd9-bcab-2de6757ae157 req-fb430a0a-08c1-4b38-a2a1-db5455e0e808 service nova] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Received event network-vif-deleted-c1570a75-1610-4f3f-bec4-98143f091678 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1017.060396] env[67144]: DEBUG oslo_concurrency.lockutils [None req-8d6ab1d3-2d38-4d44-8aa2-3e19699123bc tempest-ImagesNegativeTestJSON-722030986 tempest-ImagesNegativeTestJSON-722030986-project-member] Lock "b85f8cc3-b3d7-420e-8d1e-f636f2b85ca9" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.313s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1017.067930] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e60e25ca-21e4-4994-a7b1-2720f48ef1dc {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1017.087558] env[67144]: DEBUG nova.compute.provider_tree [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf 
tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1017.095126] env[67144]: DEBUG nova.scheduler.client.report [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1017.114145] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1017.114344] env[67144]: DEBUG nova.compute.manager [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Start building networks asynchronously for instance. 
{{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1017.168632] env[67144]: DEBUG nova.compute.utils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1017.169867] env[67144]: DEBUG nova.compute.manager [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1017.170611] env[67144]: DEBUG nova.network.neutron [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1017.200980] env[67144]: DEBUG nova.compute.manager [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Start building block device mappings for instance. 
{{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1017.231802] env[67144]: DEBUG nova.policy [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cb5a6058c1f544ee86156e01fee15d6a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9e7c742bcb6645f5b45271e527224494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 1017.286935] env[67144]: DEBUG nova.compute.manager [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Start spawning the instance on the hypervisor. 
{{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1017.311643] env[67144]: DEBUG nova.virt.hardware [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1017.311893] env[67144]: DEBUG nova.virt.hardware [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1017.312063] env[67144]: DEBUG nova.virt.hardware [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1017.312265] env[67144]: DEBUG nova.virt.hardware [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 
tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1017.312401] env[67144]: DEBUG nova.virt.hardware [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1017.312546] env[67144]: DEBUG nova.virt.hardware [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1017.312751] env[67144]: DEBUG nova.virt.hardware [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1017.312911] env[67144]: DEBUG nova.virt.hardware [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1017.313089] env[67144]: DEBUG nova.virt.hardware [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 
1017.313287] env[67144]: DEBUG nova.virt.hardware [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1017.313472] env[67144]: DEBUG nova.virt.hardware [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1017.314448] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cd92a2a-17ee-4629-bdcf-4d5a6413b632 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1017.323052] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80d68b12-812d-48bc-ad58-a9fb7267f22a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1017.544820] env[67144]: DEBUG nova.network.neutron [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Successfully created port: aa64313a-c0b7-48fe-abbc-991150174e34 {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1018.382848] env[67144]: DEBUG nova.network.neutron [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Successfully updated port: 
aa64313a-c0b7-48fe-abbc-991150174e34 {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1018.400807] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Acquiring lock "refresh_cache-3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1018.400994] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Acquired lock "refresh_cache-3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1018.401168] env[67144]: DEBUG nova.network.neutron [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1018.451418] env[67144]: DEBUG nova.network.neutron [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Instance cache missing network info. 
{{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1018.632639] env[67144]: DEBUG nova.compute.manager [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Received event network-vif-plugged-aa64313a-c0b7-48fe-abbc-991150174e34 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1018.633078] env[67144]: DEBUG oslo_concurrency.lockutils [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] Acquiring lock "3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1018.633452] env[67144]: DEBUG oslo_concurrency.lockutils [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] Lock "3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1018.633771] env[67144]: DEBUG oslo_concurrency.lockutils [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] Lock "3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1018.634056] env[67144]: DEBUG nova.compute.manager [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] No waiting events found dispatching network-vif-plugged-aa64313a-c0b7-48fe-abbc-991150174e34 {{(pid=67144) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1018.634338] env[67144]: WARNING nova.compute.manager [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Received unexpected event network-vif-plugged-aa64313a-c0b7-48fe-abbc-991150174e34 for instance with vm_state building and task_state spawning. [ 1018.634613] env[67144]: DEBUG nova.compute.manager [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Received event network-changed-aa64313a-c0b7-48fe-abbc-991150174e34 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1018.634863] env[67144]: DEBUG nova.compute.manager [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Refreshing instance network info cache due to event network-changed-aa64313a-c0b7-48fe-abbc-991150174e34. 
{{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1018.635132] env[67144]: DEBUG oslo_concurrency.lockutils [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] Acquiring lock "refresh_cache-3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1018.725410] env[67144]: DEBUG nova.network.neutron [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Updating instance_info_cache with network_info: [{"id": "aa64313a-c0b7-48fe-abbc-991150174e34", "address": "fa:16:3e:3e:09:3f", "network": {"id": "3bd4a887-a8fe-4712-b3c9-aa5d06320eac", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1941444076-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9e7c742bcb6645f5b45271e527224494", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7edb7c08-2fae-4df5-9ec6-5ccf06d7e337", "external-id": "nsx-vlan-transportzone-309", "segmentation_id": 309, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa64313a-c0", "ovs_interfaceid": "aa64313a-c0b7-48fe-abbc-991150174e34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 1018.751989] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Releasing lock "refresh_cache-3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1018.752351] env[67144]: DEBUG nova.compute.manager [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Instance network_info: |[{"id": "aa64313a-c0b7-48fe-abbc-991150174e34", "address": "fa:16:3e:3e:09:3f", "network": {"id": "3bd4a887-a8fe-4712-b3c9-aa5d06320eac", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1941444076-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9e7c742bcb6645f5b45271e527224494", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7edb7c08-2fae-4df5-9ec6-5ccf06d7e337", "external-id": "nsx-vlan-transportzone-309", "segmentation_id": 309, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa64313a-c0", "ovs_interfaceid": "aa64313a-c0b7-48fe-abbc-991150174e34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1018.752653] env[67144]: DEBUG oslo_concurrency.lockutils [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] Acquired lock "refresh_cache-3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1018.752834] env[67144]: DEBUG nova.network.neutron [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Refreshing network info cache for port aa64313a-c0b7-48fe-abbc-991150174e34 {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1018.756939] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3e:09:3f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7edb7c08-2fae-4df5-9ec6-5ccf06d7e337', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'aa64313a-c0b7-48fe-abbc-991150174e34', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1018.761266] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Creating folder: Project (9e7c742bcb6645f5b45271e527224494). Parent ref: group-v572613. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1018.762216] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-be00a36c-91d6-4dd1-91b9-cc86f318750a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1018.775394] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Created folder: Project (9e7c742bcb6645f5b45271e527224494) in parent group-v572613. [ 1018.775607] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Creating folder: Instances. Parent ref: group-v572673. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1018.775834] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a3aa1238-6c03-4e06-90e4-bd7e85652e0d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1018.784662] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Created folder: Instances in parent group-v572673. [ 1018.784896] env[67144]: DEBUG oslo.service.loopingcall [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1018.785209] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1018.785306] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ed201b32-94b8-4306-8903-ed4437794f94 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1018.804707] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1018.804707] env[67144]: value = "task-2848096" [ 1018.804707] env[67144]: _type = "Task" [ 1018.804707] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1018.817899] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848096, 'name': CreateVM_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1019.096353] env[67144]: DEBUG nova.network.neutron [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Updated VIF entry in instance network info cache for port aa64313a-c0b7-48fe-abbc-991150174e34. 
{{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1019.096709] env[67144]: DEBUG nova.network.neutron [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Updating instance_info_cache with network_info: [{"id": "aa64313a-c0b7-48fe-abbc-991150174e34", "address": "fa:16:3e:3e:09:3f", "network": {"id": "3bd4a887-a8fe-4712-b3c9-aa5d06320eac", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1941444076-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9e7c742bcb6645f5b45271e527224494", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7edb7c08-2fae-4df5-9ec6-5ccf06d7e337", "external-id": "nsx-vlan-transportzone-309", "segmentation_id": 309, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa64313a-c0", "ovs_interfaceid": "aa64313a-c0b7-48fe-abbc-991150174e34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1019.105806] env[67144]: DEBUG oslo_concurrency.lockutils [req-06e9b219-9cc2-4513-89ca-4d69610138cb req-46e9a6ba-2e00-4f4e-ac29-545a5a62c50e service nova] Releasing lock "refresh_cache-3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1019.318067] env[67144]: DEBUG oslo_vmware.api [-] Task: 
{'id': task-2848096, 'name': CreateVM_Task, 'duration_secs': 0.290718} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1019.318067] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1019.318067] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1019.318067] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1019.318067] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1019.318067] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a86b59f9-7d8f-45ff-8250-2bb59f88d41a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1019.322452] env[67144]: DEBUG oslo_vmware.api [None 
req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Waiting for the task: (returnval){ [ 1019.322452] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]522e9bfd-02de-05f0-fd56-b0835f6d9e50" [ 1019.322452] env[67144]: _type = "Task" [ 1019.322452] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1019.331673] env[67144]: DEBUG oslo_vmware.api [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]522e9bfd-02de-05f0-fd56-b0835f6d9e50, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1019.546604] env[67144]: DEBUG nova.compute.manager [req-c2652920-c7cb-4ba9-ae38-4cda6bce356c req-80212d06-6dc5-4591-954e-c6b1407532ed service nova] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Received event network-vif-deleted-aa64313a-c0b7-48fe-abbc-991150174e34 {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1019.547037] env[67144]: INFO nova.compute.manager [req-c2652920-c7cb-4ba9-ae38-4cda6bce356c req-80212d06-6dc5-4591-954e-c6b1407532ed service nova] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Neutron deleted interface aa64313a-c0b7-48fe-abbc-991150174e34; detaching it from the instance and deleting it from the info cache [ 1019.547037] env[67144]: DEBUG nova.network.neutron [req-c2652920-c7cb-4ba9-ae38-4cda6bce356c req-80212d06-6dc5-4591-954e-c6b1407532ed service nova] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1019.557434] env[67144]: 
DEBUG oslo_concurrency.lockutils [req-c2652920-c7cb-4ba9-ae38-4cda6bce356c req-80212d06-6dc5-4591-954e-c6b1407532ed service nova] Acquiring lock "3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1019.834215] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1019.836848] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1019.837249] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1054.417254] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1054.427529] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1054.427738] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1054.427906] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1054.428072] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67144) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1054.429141] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1255331c-2228-426d-ab41-42d9df3aaaf7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.437879] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd3d1755-4a46-411a-ba11-e62d5f0c2633 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.451397] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18a64fd1-c7d3-4242-8595-ceecf22e6e60 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
1054.457436] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ab16a5c-43c0-4be2-8c5a-414d4474302d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.487390] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180930MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=67144) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1054.487549] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1054.487719] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1054.524030] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance ca7b7941-c016-4968-9beb-f8c094ca16cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1054.533890] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 842426aa-72a3-4604-b50b-9705b55ea396 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1054.534123] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1054.534273] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1054.570198] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acf46b43-e3fb-431b-b13a-3795450b896a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.578084] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2875c32c-e8bd-443f-adfe-c7e774b9b19d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.607224] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79a4e4fd-0bec-4fac-8e71-a677d41235ef {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.614104] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-764a38e1-70e3-4def-92ea-3e8116f185d6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.626911] env[67144]: DEBUG nova.compute.provider_tree [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1054.634609] env[67144]: DEBUG nova.scheduler.client.report [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1054.647662] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67144) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1054.647842] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.160s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1056.647725] env[67144]: DEBUG oslo_service.periodic_task 
[None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1056.648142] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Starting heal instance info cache {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1056.648142] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Rebuilding the list of instances to heal {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1056.658456] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1056.658619] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Didn't find any instances for network info cache update. 
{{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1056.658788] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1056.659184] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1056.659344] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1057.416748] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1058.411527] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1058.416145] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1058.416315] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67144) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1059.417451] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1063.412541] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1064.824210] env[67144]: WARNING oslo_vmware.rw_handles [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1064.824210] env[67144]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1064.824210] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1064.824210] env[67144]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1064.824210] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1064.824210] env[67144]: ERROR oslo_vmware.rw_handles response.begin() [ 1064.824210] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1064.824210] env[67144]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1064.824210] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1064.824210] env[67144]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 
1064.824210] env[67144]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1064.824210] env[67144]: ERROR oslo_vmware.rw_handles [ 1064.824876] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Downloaded image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to vmware_temp/0d3dc579-da86-46ed-ae0d-ae4812e633dc/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1064.826970] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Caching image {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1064.827219] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Copying Virtual Disk [datastore1] vmware_temp/0d3dc579-da86-46ed-ae0d-ae4812e633dc/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk to [datastore1] vmware_temp/0d3dc579-da86-46ed-ae0d-ae4812e633dc/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk {{(pid=67144) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1064.827508] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-624d9f68-8e81-4269-8a26-aba9b89b8dc6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1064.835449] env[67144]: DEBUG oslo_vmware.api [None 
req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Waiting for the task: (returnval){ [ 1064.835449] env[67144]: value = "task-2848097" [ 1064.835449] env[67144]: _type = "Task" [ 1064.835449] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1064.843021] env[67144]: DEBUG oslo_vmware.api [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Task: {'id': task-2848097, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1065.346481] env[67144]: DEBUG oslo_vmware.exceptions [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Fault InvalidArgument not matched. 
{{(pid=67144) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1065.346709] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1065.347267] env[67144]: ERROR nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1065.347267] env[67144]: Faults: ['InvalidArgument'] [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Traceback (most recent call last): [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] yield resources [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] self.driver.spawn(context, instance, image_meta, [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: 
ca7b7941-c016-4968-9beb-f8c094ca16cd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] self._fetch_image_if_missing(context, vi) [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] image_cache(vi, tmp_image_ds_loc) [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] vm_util.copy_virtual_disk( [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] session._wait_for_task(vmdk_copy_task) [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] return self.wait_for_task(task_ref) [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: 
ca7b7941-c016-4968-9beb-f8c094ca16cd] return evt.wait() [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] result = hub.switch() [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] return self.greenlet.switch() [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] self.f(*self.args, **self.kw) [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] raise exceptions.translate_fault(task_info.error) [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Faults: ['InvalidArgument'] [ 1065.347267] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] [ 1065.348307] env[67144]: INFO nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 
tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Terminating instance [ 1065.350145] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1065.350361] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1065.350990] env[67144]: DEBUG nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Start destroying the instance on the hypervisor. 
{{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1065.351201] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1065.351422] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5fb4acf6-37e9-4972-9528-0588da9f43f5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1065.353883] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53b82dbc-83c2-4c8f-a99f-c241b6215789 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1065.360353] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1065.360563] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-002d0bbb-9747-4db5-a09c-cf3459af2bb0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1065.362653] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1065.362821] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None 
req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1065.363734] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-694b0f71-b8f5-43b9-9d5d-26960aba78d0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1065.368785] env[67144]: DEBUG oslo_vmware.api [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Waiting for the task: (returnval){ [ 1065.368785] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52eb11b6-992d-451b-75ac-d1f68da914ab" [ 1065.368785] env[67144]: _type = "Task" [ 1065.368785] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1065.375696] env[67144]: DEBUG oslo_vmware.api [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52eb11b6-992d-451b-75ac-d1f68da914ab, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1065.433482] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1065.433730] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1065.433913] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Deleting the datastore file [datastore1] ca7b7941-c016-4968-9beb-f8c094ca16cd {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1065.434177] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7009b63f-bd8b-4396-9533-cea51b5d69e3 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1065.440292] env[67144]: DEBUG oslo_vmware.api [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Waiting for the task: (returnval){ [ 1065.440292] env[67144]: value = "task-2848099" [ 1065.440292] env[67144]: _type = "Task" [ 1065.440292] env[67144]: } to complete. 
{{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1065.447497] env[67144]: DEBUG oslo_vmware.api [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Task: {'id': task-2848099, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1065.879292] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1065.879632] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Creating directory with path [datastore1] vmware_temp/284cfa3a-e692-4727-bd95-f7fe6bd8eed6/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1065.879742] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1c1615d1-b319-4cf4-935f-53f4f03e70ae {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1065.890930] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Created directory with path [datastore1] vmware_temp/284cfa3a-e692-4727-bd95-f7fe6bd8eed6/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1065.891154] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-5ca08904-b61b-4443-93f9-0999f91c5459 
tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Fetch image to [datastore1] vmware_temp/284cfa3a-e692-4727-bd95-f7fe6bd8eed6/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1065.891292] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/284cfa3a-e692-4727-bd95-f7fe6bd8eed6/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1065.891963] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d36dd44-56d2-4e97-9417-5dea66a2001c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1065.898335] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3bc5386-e77a-4647-9b95-f325be93bef1 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1065.906856] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f96d8a7d-8fc9-4de8-a942-dec8cec47ca2 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1065.937013] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deee930b-dca4-4846-b4d8-7938d7aaa2f9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1065.944854] env[67144]: DEBUG 
oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-72f05929-b12b-4a7d-afe3-72186d4eb5de {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1065.950909] env[67144]: DEBUG oslo_vmware.api [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Task: {'id': task-2848099, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076724} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1065.951149] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1065.951324] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1065.951491] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1065.951668] env[67144]: INFO nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Took 0.60 seconds to destroy the instance on 
the hypervisor. [ 1065.953753] env[67144]: DEBUG nova.compute.claims [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1065.953926] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1065.954155] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1065.964999] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1066.027952] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2591bb5-64e7-415b-93b0-658cc879243a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.035071] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9df0338a-58b3-427b-947e-170e83d4c231 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.065354] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb94f95d-cf6f-4d2c-b15c-7daaafdf7b8b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.072390] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48813140-2661-4bfc-9ad7-e67abff98254 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.086118] env[67144]: DEBUG nova.compute.provider_tree [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1066.094041] env[67144]: DEBUG nova.scheduler.client.report [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1066.106405] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 
tempest-ServerExternalEventsTest-1958997424-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.152s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1066.106925] env[67144]: ERROR nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1066.106925] env[67144]: Faults: ['InvalidArgument'] [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Traceback (most recent call last): [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] self.driver.spawn(context, instance, image_meta, [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] self._fetch_image_if_missing(context, vi) [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] image_cache(vi, tmp_image_ds_loc) [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] vm_util.copy_virtual_disk( [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] session._wait_for_task(vmdk_copy_task) [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] return self.wait_for_task(task_ref) [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] return evt.wait() [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] result = hub.switch() [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", 
line 313, in switch [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] return self.greenlet.switch() [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] self.f(*self.args, **self.kw) [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] raise exceptions.translate_fault(task_info.error) [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Faults: ['InvalidArgument'] [ 1066.106925] env[67144]: ERROR nova.compute.manager [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] [ 1066.107772] env[67144]: DEBUG nova.compute.utils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] VimFaultException {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1066.109241] env[67144]: DEBUG nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Build of instance ca7b7941-c016-4968-9beb-f8c094ca16cd was re-scheduled: A specified parameter was not correct: fileType [ 
1066.109241] env[67144]: Faults: ['InvalidArgument'] {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1066.109623] env[67144]: DEBUG nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1066.109792] env[67144]: DEBUG nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1066.109944] env[67144]: DEBUG nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1066.110128] env[67144]: DEBUG nova.network.neutron [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1066.168186] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) 
lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1066.169675] env[67144]: ERROR nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Traceback (most recent call last): [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] result = getattr(controller, method)(*args, **kwargs) [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self._get(image_id) [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: 
d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] resp, body = self.http_client.get(url, headers=header) [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self.request(url, 'GET', **kwargs) [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self._handle_response(resp) [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise exc.from_response(resp, resp.content) [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] During handling of the above exception, another exception occurred: [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Traceback (most recent call last): [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] yield resources [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self.driver.spawn(context, instance, image_meta, [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self._fetch_image_if_missing(context, vi) [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] image_fetch(context, vi, tmp_image_ds_loc) [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] images.fetch_image( [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1066.169675] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] metadata = IMAGE_API.get(context, image_ref) [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return session.show(context, image_id, [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] _reraise_translated_image_exception(image_id) [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise new_exc.with_traceback(exc_trace) [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1066.170611] env[67144]: ERROR nova.compute.manager 
[instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] result = getattr(controller, method)(*args, **kwargs) [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self._get(image_id) [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] resp, body = self.http_client.get(url, headers=header) [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self.request(url, 'GET', **kwargs) [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request 
[ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self._handle_response(resp) [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise exc.from_response(resp, resp.content) [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. [ 1066.170611] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] [ 1066.170611] env[67144]: INFO nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Terminating instance [ 1066.171452] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1066.171667] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1066.171905] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2ca919ad-8c64-47c5-9b2f-c3bb8055f1d0 {{(pid=67144) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.175138] env[67144]: DEBUG nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Start destroying the instance on the hypervisor. {{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1066.175368] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1066.176158] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22e8ba6d-fa94-43ca-b49f-b7ac7985ee03 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.180830] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1066.181015] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1066.183472] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-475d8261-40d8-4a9e-b7cf-45addd0b7f75 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.185612] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1066.185824] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e95d9863-04af-473c-99fc-95dbab8b75c0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.189970] env[67144]: DEBUG oslo_vmware.api [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Waiting for the task: (returnval){ [ 1066.189970] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52789f39-6b34-6129-bacf-155b49919a8c" [ 1066.189970] env[67144]: _type = "Task" [ 1066.189970] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1066.197630] env[67144]: DEBUG oslo_vmware.api [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52789f39-6b34-6129-bacf-155b49919a8c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1066.260743] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1066.260963] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1066.261166] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Deleting the datastore file [datastore1] d4eaa8fd-84b5-47a2-832a-9106187bc531 {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1066.261423] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8045ffb3-0dd7-4015-bd44-94126e4fd18e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.267557] env[67144]: DEBUG oslo_vmware.api [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Waiting for the task: (returnval){ [ 1066.267557] env[67144]: value = "task-2848101" [ 1066.267557] env[67144]: _type = "Task" [ 1066.267557] env[67144]: } to complete. 
{{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1066.275165] env[67144]: DEBUG oslo_vmware.api [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Task: {'id': task-2848101, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1066.431925] env[67144]: DEBUG nova.network.neutron [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1066.442028] env[67144]: INFO nova.compute.manager [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Took 0.33 seconds to deallocate network for instance. 
[ 1066.523433] env[67144]: INFO nova.scheduler.client.report [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Deleted allocations for instance ca7b7941-c016-4968-9beb-f8c094ca16cd [ 1066.539101] env[67144]: DEBUG oslo_concurrency.lockutils [None req-2b7b8671-d6a5-4d5d-b15f-ec954ca8c3a8 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Lock "ca7b7941-c016-4968-9beb-f8c094ca16cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 436.704s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1066.540155] env[67144]: DEBUG oslo_concurrency.lockutils [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Lock "ca7b7941-c016-4968-9beb-f8c094ca16cd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 239.715s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1066.540373] env[67144]: DEBUG oslo_concurrency.lockutils [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Acquiring lock "ca7b7941-c016-4968-9beb-f8c094ca16cd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1066.540578] env[67144]: DEBUG oslo_concurrency.lockutils [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Lock "ca7b7941-c016-4968-9beb-f8c094ca16cd-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1066.540745] env[67144]: DEBUG oslo_concurrency.lockutils [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Lock "ca7b7941-c016-4968-9beb-f8c094ca16cd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1066.542607] env[67144]: INFO nova.compute.manager [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Terminating instance [ 1066.546110] env[67144]: DEBUG nova.compute.manager [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Start destroying the instance on the hypervisor. 
{{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1066.546272] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1066.546938] env[67144]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-76dbdbe3-404a-423c-9f12-cd480aa12695 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.552879] env[67144]: DEBUG nova.compute.manager [None req-6209bf46-9f50-4da5-acbd-db94347a1282 tempest-ServersTestJSON-1067565229 tempest-ServersTestJSON-1067565229-project-member] [instance: e32c24e1-485d-48b9-827b-fceb6828510c] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1066.558709] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-767c11ee-c1a6-427b-8f28-bc0c8c44f698 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.587304] env[67144]: WARNING nova.virt.vmwareapi.vmops [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ca7b7941-c016-4968-9beb-f8c094ca16cd could not be found. 
[ 1066.587507] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1066.587683] env[67144]: INFO nova.compute.manager [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1066.587950] env[67144]: DEBUG oslo.service.loopingcall [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1066.588347] env[67144]: DEBUG nova.compute.manager [None req-6209bf46-9f50-4da5-acbd-db94347a1282 tempest-ServersTestJSON-1067565229 tempest-ServersTestJSON-1067565229-project-member] [instance: e32c24e1-485d-48b9-827b-fceb6828510c] Instance disappeared before build. 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1066.589198] env[67144]: DEBUG nova.compute.manager [-] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1066.589303] env[67144]: DEBUG nova.network.neutron [-] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1066.607165] env[67144]: DEBUG oslo_concurrency.lockutils [None req-6209bf46-9f50-4da5-acbd-db94347a1282 tempest-ServersTestJSON-1067565229 tempest-ServersTestJSON-1067565229-project-member] Lock "e32c24e1-485d-48b9-827b-fceb6828510c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.009s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1066.610974] env[67144]: DEBUG nova.network.neutron [-] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1066.616147] env[67144]: DEBUG nova.compute.manager [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Starting instance... {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1066.618600] env[67144]: INFO nova.compute.manager [-] [instance: ca7b7941-c016-4968-9beb-f8c094ca16cd] Took 0.03 seconds to deallocate network for instance. 
[ 1066.660945] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1066.661254] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1066.662643] env[67144]: INFO nova.compute.claims [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1066.702871] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1066.703938] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Creating directory with path [datastore1] vmware_temp/d40faed7-6fac-485a-81d5-de39f847f6a2/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1066.703938] env[67144]: DEBUG oslo_vmware.service [-] 
Invoking FileManager.MakeDirectory with opID=oslo.vmware-5b6deed4-fcad-4a63-b053-183ffa62a1b3 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.714880] env[67144]: DEBUG oslo_concurrency.lockutils [None req-e3a2760c-890e-4317-ac6b-c82f699d44d4 tempest-ServerExternalEventsTest-1958997424 tempest-ServerExternalEventsTest-1958997424-project-member] Lock "ca7b7941-c016-4968-9beb-f8c094ca16cd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.175s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1066.716666] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Created directory with path [datastore1] vmware_temp/d40faed7-6fac-485a-81d5-de39f847f6a2/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1066.716854] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Fetch image to [datastore1] vmware_temp/d40faed7-6fac-485a-81d5-de39f847f6a2/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1066.717036] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/d40faed7-6fac-485a-81d5-de39f847f6a2/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) 
_fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1066.718059] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdd723eb-2c51-4059-b984-8f7c4d4c7c28 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.727834] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d22d7b66-cbec-49e9-974b-4149c6569349 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.738494] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43e6bb1d-9f77-4ecd-b6f1-458cb9944a63 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.742186] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-920db7ca-645c-4a70-a3d0-e58491c4bf82 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.772783] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7633eb3a-9f9f-4a84-8afb-4e633c1c6617 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.779320] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70586f66-3659-4b3b-b853-385ca271e32a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.788078] env[67144]: DEBUG oslo_vmware.api [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Task: {'id': task-2848101, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066669} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1066.813487] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1066.813684] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1066.813860] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1066.814067] env[67144]: INFO nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Took 0.64 seconds to destroy the instance on the hypervisor. 
[ 1066.816652] env[67144]: DEBUG nova.compute.claims [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1066.816817] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1066.817515] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c714ffd8-37b4-47d9-aa26-a40a5e194c65 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.821594] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-526ab575-9d40-4b31-a075-45c515d5dccb {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.826042] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5483120-7e91-4cae-ad02-f52050e535b8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.839059] env[67144]: DEBUG nova.compute.provider_tree [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1066.843868] env[67144]: DEBUG nova.virt.vmwareapi.images [None 
req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1066.847646] env[67144]: DEBUG nova.scheduler.client.report [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1066.860096] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1066.860568] env[67144]: DEBUG nova.compute.manager [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Start building networks asynchronously for instance. 
{{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1066.862759] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.046s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1066.885125] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.022s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1066.885768] env[67144]: DEBUG nova.compute.utils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Instance d4eaa8fd-84b5-47a2-832a-9106187bc531 could not be found. {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1066.887111] env[67144]: DEBUG nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Instance disappeared during build. 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1066.887283] env[67144]: DEBUG nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1066.887443] env[67144]: DEBUG nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1066.887613] env[67144]: DEBUG nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1066.887786] env[67144]: DEBUG nova.network.neutron [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1066.906858] env[67144]: DEBUG nova.compute.utils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Using /dev/sd instead of None {{(pid=67144) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1066.908262] env[67144]: DEBUG nova.compute.manager [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e 
tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Allocating IP information in the background. {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1066.908422] env[67144]: DEBUG nova.network.neutron [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] allocate_for_instance() {{(pid=67144) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1066.916468] env[67144]: DEBUG nova.compute.manager [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Start building block device mappings for instance. {{(pid=67144) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1066.952996] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1066.953788] env[67144]: ERROR nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. 
[ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Traceback (most recent call last): [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] result = getattr(controller, method)(*args, **kwargs) [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self._get(image_id) [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] resp, body = self.http_client.get(url, headers=header) [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self.request(url, 'GET', **kwargs) [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self._handle_response(resp) [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise exc.from_response(resp, resp.content) [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] During handling of the above exception, another exception occurred: [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Traceback (most recent call last): [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] yield resources [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self.driver.spawn(context, instance, image_meta, [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self._fetch_image_if_missing(context, vi) [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] image_fetch(context, vi, tmp_image_ds_loc) [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] images.fetch_image( [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1066.953788] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] metadata = IMAGE_API.get(context, image_ref) [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return session.show(context, image_id, [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] _reraise_translated_image_exception(image_id) [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise new_exc.with_traceback(exc_trace) [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1066.954908] env[67144]: ERROR nova.compute.manager 
[instance: f61f525f-70a5-402f-bf52-0bd4041b907f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] result = getattr(controller, method)(*args, **kwargs) [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self._get(image_id) [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] resp, body = self.http_client.get(url, headers=header) [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self.request(url, 'GET', **kwargs) [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request 
[ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self._handle_response(resp) [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise exc.from_response(resp, resp.content) [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. [ 1066.954908] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] [ 1066.954908] env[67144]: INFO nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Terminating instance [ 1066.955964] env[67144]: DEBUG nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Start destroying the instance on the hypervisor. 
{{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1066.956226] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1066.956773] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1066.956969] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1066.957812] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f09d4ba-0780-4813-bc00-e4b6a103046d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.961054] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ed3d1fd5-d96d-4e9b-9282-a68bd7e0dd08 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.969677] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Unregistering the VM 
{{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1066.970691] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6af9ba2e-4902-4e03-b648-67e05225b62d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.972138] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1066.972314] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1066.972966] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fc606734-4283-4c71-a417-92b6484ef459 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1066.978150] env[67144]: DEBUG oslo_vmware.api [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Waiting for the task: (returnval){ [ 1066.978150] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52643f32-f67f-13e6-2485-531ff25fc39f" [ 1066.978150] env[67144]: _type = "Task" [ 1066.978150] env[67144]: } to complete. 
{{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1066.982093] env[67144]: DEBUG nova.compute.manager [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Start spawning the instance on the hypervisor. {{(pid=67144) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1066.988862] env[67144]: DEBUG oslo_vmware.api [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]52643f32-f67f-13e6-2485-531ff25fc39f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1067.002560] env[67144]: DEBUG nova.policy [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b9815601439045639479615912d4cecb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3902eab35278495e87c590a781241e16', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67144) authorize /opt/stack/nova/nova/policy.py:203}} [ 1067.031928] env[67144]: DEBUG nova.virt.hardware [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Getting desirable topologies for flavor 
Flavor(created_at=2025-03-07T07:32:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:32:26Z,direct_url=,disk_format='vmdk',id=0a8f8f2e-82dd-4c4f-80fe-9515de315a84,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='d66969059be64e7d86646b564fe28c7d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:32:27Z,virtual_size=,visibility=), allow threads: False {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1067.031928] env[67144]: DEBUG nova.virt.hardware [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Flavor limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1067.031928] env[67144]: DEBUG nova.virt.hardware [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Image limits 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1067.032280] env[67144]: DEBUG nova.virt.hardware [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Flavor pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1067.032280] env[67144]: DEBUG nova.virt.hardware [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752
tempest-ServerRescueNegativeTestJSON-105378752-project-member] Image pref 0:0:0 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1067.032448] env[67144]: DEBUG nova.virt.hardware [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67144) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1067.032491] env[67144]: DEBUG nova.virt.hardware [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1067.032655] env[67144]: DEBUG nova.virt.hardware [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1067.032819] env[67144]: DEBUG nova.virt.hardware [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Got 1 possible topologies {{(pid=67144) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1067.032979] env[67144]: DEBUG nova.virt.hardware [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1067.033163] env[67144]: DEBUG nova.virt.hardware [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67144) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1067.034014] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f8bc6bd-7a22-4e14-8f13-10ce072ae19e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1067.037120] env[67144]: DEBUG neutronclient.v2_0.client [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67144) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1067.038709] env[67144]: ERROR nova.compute.manager [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Traceback (most recent call last):
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] result = getattr(controller, method)(*args, **kwargs)
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self._get(image_id)
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] resp, body = self.http_client.get(url, headers=header)
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py",
line 395, in get
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self.request(url, 'GET', **kwargs)
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self._handle_response(resp)
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise exc.from_response(resp, resp.content)
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] During handling of the above exception, another exception occurred:
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Traceback (most recent call last):
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self.driver.spawn(context, instance, image_meta,
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self._fetch_image_if_missing(context, vi)
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] image_fetch(context, vi, tmp_image_ds_loc)
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] images.fetch_image(
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] metadata = IMAGE_API.get(context, image_ref)
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return session.show(context, image_id,
[ 1067.038709] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] _reraise_translated_image_exception(image_id)
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise new_exc.with_traceback(exc_trace)
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [
1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] result = getattr(controller, method)(*args, **kwargs)
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self._get(image_id)
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] resp, body = self.http_client.get(url, headers=header)
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self.request(url, 'GET', **kwargs)
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self._handle_response(resp)
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise exc.from_response(resp, resp.content)
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84.
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] During handling of the above exception, another exception occurred:
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Traceback (most recent call last):
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self._build_and_run_instance(context, instance, image,
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] with excutils.save_and_reraise_exception():
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self.force_reraise()
[ 1067.039807] env[67144]:
ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise self.value
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] with self.rt.instance_claim(context, instance, node, allocs,
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self.abort()
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1067.039807] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return f(*args, **kwargs)
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self._unset_instance_host_and_node(instance)
[ 1067.040957] env[67144]: ERROR nova.compute.manager
[instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] instance.save()
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] updates, result = self.indirection_api.object_action(
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] result = self.transport._send(
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self._driver.send(target, ctxt, message,
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self._send(target, ctxt,
message, wait_for_reply, timeout,
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise result
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] nova.exception_Remote.InstanceNotFound_Remote: Instance d4eaa8fd-84b5-47a2-832a-9106187bc531 could not be found.
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Traceback (most recent call last):
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return getattr(target, method)(*args, **kwargs)
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return fn(self, *args, **kwargs)
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] old_ref, inst_ref =
db.instance_update_and_get_original(
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return f(*args, **kwargs)
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] with excutils.save_and_reraise_exception() as ectxt:
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self.force_reraise()
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise self.value
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line
142, in wrapper
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return f(*args, **kwargs)
[ 1067.040957] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return f(context, *args, **kwargs)
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise exception.InstanceNotFound(instance_id=uuid)
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] nova.exception.InstanceNotFound: Instance d4eaa8fd-84b5-47a2-832a-9106187bc531 could not be found.
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] During handling of the above exception, another exception occurred:
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Traceback (most recent call last):
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] ret = obj(*args, **kwargs)
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] exception_handler_v20(status_code, error_body)
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise client_exc(message=error_message,
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance:
d4eaa8fd-84b5-47a2-832a-9106187bc531] Neutron server returns request_ids: ['req-37b0ffa7-e139-4c6e-9dbf-1eb3ae405ecc']
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] During handling of the above exception, another exception occurred:
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531]
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] Traceback (most recent call last):
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self._deallocate_network(context, instance, requested_networks)
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self.network_api.deallocate_for_instance(
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] data = neutron.list_ports(**search_opts)
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] ret = obj(*args, **kwargs)
[ 1067.042158] env[67144]: ERROR
nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self.list('ports', self.ports_path, retrieve_all,
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] ret = obj(*args, **kwargs)
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1067.042158] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] for r in self._pagination(collection, path, **params):
[ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] res = self.get(path, params=params)
[ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] ret = obj(*args, **kwargs)
[ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self.retry_request("GET", action, body=body, [
1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] ret = obj(*args, **kwargs) [ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] return self.do_request(method, action, body=body, [ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] ret = obj(*args, **kwargs) [ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] self._handle_fault_response(status_code, replybody, resp) [ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] raise exception.Unauthorized() [ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] nova.exception.Unauthorized: Not authorized. 
[ 1067.043272] env[67144]: ERROR nova.compute.manager [instance: d4eaa8fd-84b5-47a2-832a-9106187bc531] [ 1067.045219] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a6e2ef8-2b6c-43c9-81ee-ad6c6255f9b7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1067.068799] env[67144]: DEBUG oslo_concurrency.lockutils [None req-5ca08904-b61b-4443-93f9-0999f91c5459 tempest-ServerTagsTestJSON-1428668573 tempest-ServerTagsTestJSON-1428668573-project-member] Lock "d4eaa8fd-84b5-47a2-832a-9106187bc531" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 363.366s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1067.071200] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1067.071399] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1067.071574] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Deleting the datastore file [datastore1] f61f525f-70a5-402f-bf52-0bd4041b907f {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1067.071862] env[67144]: DEBUG oslo_vmware.service [-] Invoking 
FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9a4bf1fc-edb4-47ab-a6a7-58543fb6779c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1067.077509] env[67144]: DEBUG oslo_vmware.api [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Waiting for the task: (returnval){ [ 1067.077509] env[67144]: value = "task-2848103" [ 1067.077509] env[67144]: _type = "Task" [ 1067.077509] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1067.085219] env[67144]: DEBUG oslo_vmware.api [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Task: {'id': task-2848103, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1067.300573] env[67144]: DEBUG nova.network.neutron [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Successfully created port: 2e172af0-911b-4289-9b23-86e83386c66e {{(pid=67144) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1067.488878] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1067.489107] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Creating 
directory with path [datastore1] vmware_temp/9f0593f2-5f6f-4968-a983-46fe97869792/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1067.489333] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ecf2cd63-3faa-4f85-b42a-73be62b7150b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1067.501090] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Created directory with path [datastore1] vmware_temp/9f0593f2-5f6f-4968-a983-46fe97869792/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1067.501293] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Fetch image to [datastore1] vmware_temp/9f0593f2-5f6f-4968-a983-46fe97869792/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1067.501463] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/9f0593f2-5f6f-4968-a983-46fe97869792/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1067.502236] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-69401a66-050c-4350-9121-05a417ea72cf {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1067.509015] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86d2b315-52d5-4e4d-8ebe-067fd4eff9c6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1067.517841] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a2445e4-0121-495b-981b-8ce0c2826008 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1067.549038] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-208361d1-455b-478b-85a4-df8ad3fa0dca {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1067.554402] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9ce0ada5-b290-4f3f-9e92-3d314a6bd37c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1067.573415] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1067.585290] env[67144]: DEBUG oslo_vmware.api [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Task: {'id': task-2848103, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072492} completed successfully. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1067.585525] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1067.585743] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1067.585921] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1067.586153] env[67144]: INFO nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 1067.588142] env[67144]: DEBUG nova.compute.claims [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1067.588318] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1067.588525] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1067.613739] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1067.614429] env[67144]: DEBUG nova.compute.utils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Instance f61f525f-70a5-402f-bf52-0bd4041b907f could not be found. 
{{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1067.615817] env[67144]: DEBUG nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Instance disappeared during build. {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1067.615967] env[67144]: DEBUG nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1067.616137] env[67144]: DEBUG nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1067.616296] env[67144]: DEBUG nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1067.616453] env[67144]: DEBUG nova.network.neutron [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1067.669763] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1067.670172] env[67144]: ERROR nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. 
[ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Traceback (most recent call last): [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] result = getattr(controller, method)(*args, **kwargs) [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self._get(image_id) [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] resp, body = self.http_client.get(url, headers=header) [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self.request(url, 'GET', **kwargs) [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self._handle_response(resp) [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise exc.from_response(resp, resp.content) [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] During handling of the above exception, another exception occurred: [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Traceback (most recent call last): [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] yield resources [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self.driver.spawn(context, instance, image_meta, [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self._fetch_image_if_missing(context, vi) [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] image_fetch(context, vi, tmp_image_ds_loc) [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] images.fetch_image( [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1067.670172] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] metadata = IMAGE_API.get(context, image_ref) [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return session.show(context, image_id, [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] _reraise_translated_image_exception(image_id) [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise new_exc.with_traceback(exc_trace) [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1067.671434] env[67144]: ERROR nova.compute.manager 
[instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] result = getattr(controller, method)(*args, **kwargs) [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self._get(image_id) [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] resp, body = self.http_client.get(url, headers=header) [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self.request(url, 'GET', **kwargs) [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request 
[ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self._handle_response(resp) [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise exc.from_response(resp, resp.content) [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. [ 1067.671434] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1067.671434] env[67144]: INFO nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Terminating instance [ 1067.672649] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1067.672649] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1067.673751] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-f7339de0-af74-4045-b0bb-e79bb3ffaaf9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1067.676074] env[67144]: DEBUG nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Start destroying the instance on the hypervisor. {{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1067.676268] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1067.677201] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-519de706-4c2f-4839-9931-1d8560da5cd8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1067.684609] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1067.684819] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-094a279d-42ab-4069-8c60-307643e5fb9e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1067.687055] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Created directory with 
path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1067.687235] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1067.688148] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-31b7abb2-aa45-413c-b831-0e560781f9cb {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1067.693173] env[67144]: DEBUG oslo_vmware.api [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Waiting for the task: (returnval){
[ 1067.693173] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5247761e-f304-a2aa-66aa-039f5f9f1a98"
[ 1067.693173] env[67144]: _type = "Task"
[ 1067.693173] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1067.700543] env[67144]: DEBUG oslo_vmware.api [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5247761e-f304-a2aa-66aa-039f5f9f1a98, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1067.744211] env[67144]: DEBUG neutronclient.v2_0.client [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67144) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1067.745857] env[67144]: ERROR nova.compute.manager [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Traceback (most recent call last):
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] result = getattr(controller, method)(*args, **kwargs)
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self._get(image_id)
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] resp, body = self.http_client.get(url, headers=header)
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self.request(url, 'GET', **kwargs)
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self._handle_response(resp)
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise exc.from_response(resp, resp.content)
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] During handling of the above exception, another exception occurred:
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Traceback (most recent call last):
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self.driver.spawn(context, instance, image_meta,
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self._fetch_image_if_missing(context, vi)
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] image_fetch(context, vi, tmp_image_ds_loc)
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] images.fetch_image(
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] metadata = IMAGE_API.get(context, image_ref)
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return session.show(context, image_id,
[ 1067.745857] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] _reraise_translated_image_exception(image_id)
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise new_exc.with_traceback(exc_trace)
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] result = getattr(controller, method)(*args, **kwargs)
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self._get(image_id)
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] resp, body = self.http_client.get(url, headers=header)
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self.request(url, 'GET', **kwargs)
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self._handle_response(resp)
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise exc.from_response(resp, resp.content)
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84.
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] During handling of the above exception, another exception occurred:
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Traceback (most recent call last):
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self._build_and_run_instance(context, instance, image,
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] with excutils.save_and_reraise_exception():
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self.force_reraise()
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise self.value
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] with self.rt.instance_claim(context, instance, node, allocs,
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self.abort()
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1067.746902] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return f(*args, **kwargs)
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self._unset_instance_host_and_node(instance)
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] instance.save()
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] updates, result = self.indirection_api.object_action(
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] result = self.transport._send(
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self._driver.send(target, ctxt, message,
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise result
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] nova.exception_Remote.InstanceNotFound_Remote: Instance f61f525f-70a5-402f-bf52-0bd4041b907f could not be found.
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Traceback (most recent call last):
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return getattr(target, method)(*args, **kwargs)
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return fn(self, *args, **kwargs)
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] old_ref, inst_ref = db.instance_update_and_get_original(
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return f(*args, **kwargs)
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] with excutils.save_and_reraise_exception() as ectxt:
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self.force_reraise()
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise self.value
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return f(*args, **kwargs)
[ 1067.748200] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return f(context, *args, **kwargs)
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise exception.InstanceNotFound(instance_id=uuid)
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] nova.exception.InstanceNotFound: Instance f61f525f-70a5-402f-bf52-0bd4041b907f could not be found.
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] During handling of the above exception, another exception occurred:
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Traceback (most recent call last):
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] ret = obj(*args, **kwargs)
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] exception_handler_v20(status_code, error_body)
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise client_exc(message=error_message,
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Neutron server returns request_ids: ['req-3b30a420-98a9-4994-a25d-da4611474214']
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] During handling of the above exception, another exception occurred:
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Traceback (most recent call last):
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self._deallocate_network(context, instance, requested_networks)
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self.network_api.deallocate_for_instance(
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] data = neutron.list_ports(**search_opts)
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] ret = obj(*args, **kwargs)
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self.list('ports', self.ports_path, retrieve_all,
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] ret = obj(*args, **kwargs)
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1067.750456] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] for r in self._pagination(collection, path, **params):
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] res = self.get(path, params=params)
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] ret = obj(*args, **kwargs)
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self.retry_request("GET", action, body=body,
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] ret = obj(*args, **kwargs)
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] return self.do_request(method, action, body=body,
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] ret = obj(*args, **kwargs)
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] self._handle_fault_response(status_code, replybody, resp)
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] raise exception.Unauthorized()
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] nova.exception.Unauthorized: Not authorized.
[ 1067.752870] env[67144]: ERROR nova.compute.manager [instance: f61f525f-70a5-402f-bf52-0bd4041b907f]
[ 1067.754503] env[67144]: DEBUG nova.compute.manager [req-f3ab8f85-40d3-4b74-bd8d-62c8962f3f87 req-530634e5-995a-473e-9729-5cf746ec2767 service nova] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Received event network-vif-plugged-2e172af0-911b-4289-9b23-86e83386c66e {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1067.754717] env[67144]: DEBUG oslo_concurrency.lockutils [req-f3ab8f85-40d3-4b74-bd8d-62c8962f3f87 req-530634e5-995a-473e-9729-5cf746ec2767 service nova] Acquiring lock "842426aa-72a3-4604-b50b-9705b55ea396-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1067.754927] env[67144]: DEBUG oslo_concurrency.lockutils [req-f3ab8f85-40d3-4b74-bd8d-62c8962f3f87 req-530634e5-995a-473e-9729-5cf746ec2767 service nova] Lock "842426aa-72a3-4604-b50b-9705b55ea396-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1067.755103] env[67144]: DEBUG oslo_concurrency.lockutils [req-f3ab8f85-40d3-4b74-bd8d-62c8962f3f87 req-530634e5-995a-473e-9729-5cf746ec2767 service nova] Lock "842426aa-72a3-4604-b50b-9705b55ea396-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1067.755263] env[67144]: DEBUG nova.compute.manager [req-f3ab8f85-40d3-4b74-bd8d-62c8962f3f87 req-530634e5-995a-473e-9729-5cf746ec2767 service nova] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] No waiting events found dispatching network-vif-plugged-2e172af0-911b-4289-9b23-86e83386c66e {{(pid=67144) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1067.755420] env[67144]: WARNING nova.compute.manager [req-f3ab8f85-40d3-4b74-bd8d-62c8962f3f87 req-530634e5-995a-473e-9729-5cf746ec2767 service nova] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Received unexpected event network-vif-plugged-2e172af0-911b-4289-9b23-86e83386c66e for instance with vm_state building and task_state spawning.
[ 1067.760196] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1067.760579] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1067.760579] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Deleting the datastore file [datastore1] b1bba9da-84f7-4d67-8ad6-af7cb429dd9c {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1067.761302] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1e2a8b7c-5523-497a-a73b-ca2af847b05d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1067.770891] env[67144]: DEBUG oslo_vmware.api [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Waiting for the task: (returnval){
[ 1067.770891] env[67144]: value = "task-2848105"
[ 1067.770891] env[67144]: _type = "Task"
[ 1067.770891] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1067.776989] env[67144]: DEBUG oslo_concurrency.lockutils [None req-b992e117-ffda-45e3-be39-5d0dc9f0e0f1 tempest-ServersTestMultiNic-1383773212 tempest-ServersTestMultiNic-1383773212-project-member] Lock "f61f525f-70a5-402f-bf52-0bd4041b907f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 366.713s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1067.776989] env[67144]: DEBUG oslo_concurrency.lockutils [req-de152ec7-26e9-424c-aaf0-8b291ef4bfe3 req-8e5a339d-9706-4e43-bf2d-c0447dea0dbb service nova] Acquired lock "f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1067.780043] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aea50304-7d85-4540-ba95-9546c6bf7178 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1067.782191] env[67144]: DEBUG oslo_vmware.api [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Task: {'id': task-2848105, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1067.787784] env[67144]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error.
[ 1067.787999] env[67144]: DEBUG oslo_vmware.api [-] Fault list: [ManagedObjectNotFound] {{(pid=67144) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}}
[ 1067.788400] env[67144]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4d5b7669-2c82-42e1-9841-18498710abbb {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1067.796727] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fad08cf-3ba3-46b4-b650-d1e2e273b40c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1067.824861] env[67144]: ERROR root [req-de152ec7-26e9-424c-aaf0-8b291ef4bfe3 req-8e5a339d-9706-4e43-bf2d-c0447dea0dbb service nova] Original exception being dropped: ['Traceback (most recent call last):\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py", line 377, in request_handler\n response = request(managed_object, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 586, in __call__\n return client.invoke(args, kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 728, in invoke\n result = self.send(soapenv, timeout=timeout)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 777, in send\n return self.process_reply(reply.message, None, None)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 840, in process_reply\n raise WebFault(fault, replyroot)\n', "suds.WebFault: Server raised fault: 'The object 'vim.VirtualMachine:vm-572654' has already been deleted or has not been completely created'\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 301, in _invoke_api\n return api_method(*args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/vim_util.py", line 480, in get_object_property\n props = get_object_properties(vim, moref, [property_name],\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/vim_util.py", line 360, in get_object_properties\n retrieve_result = vim.RetrievePropertiesEx(\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py", line 413, in request_handler\n raise exceptions.VimFaultException(fault_list, fault_string,\n', "oslo_vmware.exceptions.VimFaultException: The object 'vim.VirtualMachine:vm-572654' has already been deleted or has not been completely created\nCause: Server raised fault: 'The object 'vim.VirtualMachine:vm-572654' has already been deleted or has not been completely created'\nFaults: [ManagedObjectNotFound]\nDetails: {'obj': 'vm-572654'}\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 123, in _call_method\n return self.invoke_api(module, method, self.vim, *args,\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 358, in invoke_api\n return _invoke_api(module, method, *args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 122, in func\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 122, in _inner\n idle = self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 96, in _func\n result = f(*args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 341, in _invoke_api\n raise clazz(str(excep),\n',
"oslo_vmware.exceptions.ManagedObjectNotFoundException: The object 'vim.VirtualMachine:vm-572654' has already been deleted or has not been completely created\nCause: Server raised fault: 'The object 'vim.VirtualMachine:vm-572654' has already been deleted or has not been completely created'\nFaults: [ManagedObjectNotFound]\nDetails: {'obj': 'vm-572654'}\n"]: nova.exception.InstanceNotFound: Instance f61f525f-70a5-402f-bf52-0bd4041b907f could not be found. [ 1067.825092] env[67144]: DEBUG oslo_concurrency.lockutils [req-de152ec7-26e9-424c-aaf0-8b291ef4bfe3 req-8e5a339d-9706-4e43-bf2d-c0447dea0dbb service nova] Releasing lock "f61f525f-70a5-402f-bf52-0bd4041b907f" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1067.825340] env[67144]: DEBUG nova.compute.manager [req-de152ec7-26e9-424c-aaf0-8b291ef4bfe3 req-8e5a339d-9706-4e43-bf2d-c0447dea0dbb service nova] [instance: f61f525f-70a5-402f-bf52-0bd4041b907f] Detach interface failed, port_id=44714ba6-ad01-48a3-bfe7-d65dc34dd361, reason: Instance f61f525f-70a5-402f-bf52-0bd4041b907f could not be found. 
{{(pid=67144) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10838}} [ 1067.886699] env[67144]: DEBUG nova.network.neutron [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Successfully updated port: 2e172af0-911b-4289-9b23-86e83386c66e {{(pid=67144) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1067.896799] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Acquiring lock "refresh_cache-842426aa-72a3-4604-b50b-9705b55ea396" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1067.896949] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Acquired lock "refresh_cache-842426aa-72a3-4604-b50b-9705b55ea396" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1067.897110] env[67144]: DEBUG nova.network.neutron [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1067.949670] env[67144]: DEBUG nova.network.neutron [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Instance cache missing network info. 
{{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1068.111597] env[67144]: DEBUG nova.network.neutron [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Updating instance_info_cache with network_info: [{"id": "2e172af0-911b-4289-9b23-86e83386c66e", "address": "fa:16:3e:28:28:2c", "network": {"id": "0fa1e319-a38f-41b9-868f-3f758185cf0c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-525376206-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3902eab35278495e87c590a781241e16", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2c06e3c2-8edb-4cf0-be6b-45dfe059c00b", "external-id": "nsx-vlan-transportzone-264", "segmentation_id": 264, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e172af0-91", "ovs_interfaceid": "2e172af0-911b-4289-9b23-86e83386c66e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1068.125160] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Releasing lock "refresh_cache-842426aa-72a3-4604-b50b-9705b55ea396" {{(pid=67144) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1068.125439] env[67144]: DEBUG nova.compute.manager [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Instance network_info: |[{"id": "2e172af0-911b-4289-9b23-86e83386c66e", "address": "fa:16:3e:28:28:2c", "network": {"id": "0fa1e319-a38f-41b9-868f-3f758185cf0c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-525376206-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3902eab35278495e87c590a781241e16", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2c06e3c2-8edb-4cf0-be6b-45dfe059c00b", "external-id": "nsx-vlan-transportzone-264", "segmentation_id": 264, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e172af0-91", "ovs_interfaceid": "2e172af0-911b-4289-9b23-86e83386c66e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67144) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1068.125900] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:28:28:2c', 'network_ref': 
{'type': 'OpaqueNetwork', 'network-id': '2c06e3c2-8edb-4cf0-be6b-45dfe059c00b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2e172af0-911b-4289-9b23-86e83386c66e', 'vif_model': 'vmxnet3'}] {{(pid=67144) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1068.133306] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Creating folder: Project (3902eab35278495e87c590a781241e16). Parent ref: group-v572613. {{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1068.133775] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-17002802-643d-491e-96bd-39d004af3090 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.145096] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Created folder: Project (3902eab35278495e87c590a781241e16) in parent group-v572613. [ 1068.145266] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Creating folder: Instances. Parent ref: group-v572676. 
{{(pid=67144) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1068.145462] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4504922e-3718-4664-8dee-49dc8df7b226 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.153928] env[67144]: INFO nova.virt.vmwareapi.vm_util [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Created folder: Instances in parent group-v572676. [ 1068.154150] env[67144]: DEBUG oslo.service.loopingcall [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1068.154313] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Creating VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1068.154486] env[67144]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0fe790ff-9287-45b1-96e7-afaf1d1d60b7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.173277] env[67144]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1068.173277] env[67144]: value = "task-2848108" [ 1068.173277] env[67144]: _type = "Task" [ 1068.173277] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1068.182509] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848108, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1068.201579] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1068.201816] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Creating directory with path [datastore1] vmware_temp/2c54b1ec-ee9d-40b1-9bae-76326f95e2c1/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1068.202020] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e0d002ef-5008-4f6f-9160-55a3473a459c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.212415] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Created directory with path [datastore1] vmware_temp/2c54b1ec-ee9d-40b1-9bae-76326f95e2c1/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1068.212600] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Fetch image to [datastore1] vmware_temp/2c54b1ec-ee9d-40b1-9bae-76326f95e2c1/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1068.212770] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/2c54b1ec-ee9d-40b1-9bae-76326f95e2c1/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1068.213463] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17044ab6-3fe4-4ce8-9737-4966547ea869 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.220840] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ede5284d-e5cb-43fa-8f94-2c77776e77aa {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.230607] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5260c68-84ac-4452-8f66-4e7e5b16a7fe {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.262058] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c284e6c6-f744-4141-9fda-b6468a75b6e9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.270312] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-091aff07-a7bf-4fd9-8d1c-54bf1be0877b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.280199] env[67144]: DEBUG 
oslo_vmware.api [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Task: {'id': task-2848105, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080471} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1068.280426] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1068.280605] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1068.280782] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1068.280933] env[67144]: INFO nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1068.283277] env[67144]: DEBUG nova.compute.claims [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1068.283517] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1068.283742] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1068.292627] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1068.308334] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=67144) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1068.309037] env[67144]: DEBUG nova.compute.utils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Instance b1bba9da-84f7-4d67-8ad6-af7cb429dd9c could not be found. {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1068.310476] env[67144]: DEBUG nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Instance disappeared during build. {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1068.310648] env[67144]: DEBUG nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1068.310813] env[67144]: DEBUG nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1068.311013] env[67144]: DEBUG nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1068.311201] env[67144]: DEBUG nova.network.neutron [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1068.336220] env[67144]: DEBUG neutronclient.v2_0.client [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67144) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1068.337740] env[67144]: ERROR nova.compute.manager [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Traceback (most recent call last): [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] result = getattr(controller, method)(*args, **kwargs) [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self._get(image_id) [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] resp, body = self.http_client.get(url, headers=header) [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self.request(url, 'GET', **kwargs) [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self._handle_response(resp) [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise exc.from_response(resp, resp.content) [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] During handling of the above exception, another exception occurred: [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Traceback (most recent call last): [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self.driver.spawn(context, instance, image_meta, [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self._fetch_image_if_missing(context, vi) [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] image_fetch(context, vi, tmp_image_ds_loc) [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] images.fetch_image( [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] metadata = IMAGE_API.get(context, image_ref) [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return session.show(context, image_id, [ 1068.337740] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] _reraise_translated_image_exception(image_id) [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise new_exc.with_traceback(exc_trace) [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 
1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] result = getattr(controller, method)(*args, **kwargs) [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self._get(image_id) [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] resp, body = self.http_client.get(url, headers=header) [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self.request(url, 'GET', **kwargs) [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self._handle_response(resp) [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise exc.from_response(resp, resp.content) [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] During handling of the above exception, another exception occurred: [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Traceback (most recent call last): [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self._build_and_run_instance(context, instance, image, [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] with excutils.save_and_reraise_exception(): [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self.force_reraise() [ 1068.338811] env[67144]: 
ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise self.value [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] with self.rt.instance_claim(context, instance, node, allocs, [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self.abort() [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1068.338811] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return f(*args, **kwargs) [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self._unset_instance_host_and_node(instance) [ 1068.339857] env[67144]: ERROR nova.compute.manager 
[instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] instance.save() [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] updates, result = self.indirection_api.object_action( [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return cctxt.call(context, 'object_action', objinst=objinst, [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] result = self.transport._send( [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self._driver.send(target, ctxt, message, [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self._send(target, ctxt, 
message, wait_for_reply, timeout, [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise result [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] nova.exception_Remote.InstanceNotFound_Remote: Instance b1bba9da-84f7-4d67-8ad6-af7cb429dd9c could not be found. [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Traceback (most recent call last): [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return getattr(target, method)(*args, **kwargs) [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return fn(self, *args, **kwargs) [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] old_ref, inst_ref = 
db.instance_update_and_get_original( [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return f(*args, **kwargs) [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] with excutils.save_and_reraise_exception() as ectxt: [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self.force_reraise() [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise self.value [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 
142, in wrapper [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return f(*args, **kwargs) [ 1068.339857] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return f(context, *args, **kwargs) [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise exception.InstanceNotFound(instance_id=uuid) [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] nova.exception.InstanceNotFound: Instance b1bba9da-84f7-4d67-8ad6-af7cb429dd9c could not be found. 
[ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] During handling of the above exception, another exception occurred: [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Traceback (most recent call last): [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] ret = obj(*args, **kwargs) [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] exception_handler_v20(status_code, error_body) [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise client_exc(message=error_message, [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: 
b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Neutron server returns request_ids: ['req-fe60ef13-4239-40ba-9b74-71d3cd33f3c6'] [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] During handling of the above exception, another exception occurred: [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] Traceback (most recent call last): [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self._deallocate_network(context, instance, requested_networks) [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self.network_api.deallocate_for_instance( [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] data = neutron.list_ports(**search_opts) [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] ret = obj(*args, **kwargs) [ 1068.341052] env[67144]: ERROR 
nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self.list('ports', self.ports_path, retrieve_all, [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] ret = obj(*args, **kwargs) [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1068.341052] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] for r in self._pagination(collection, path, **params): [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] res = self.get(path, params=params) [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] ret = obj(*args, **kwargs) [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self.retry_request("GET", action, body=body, [ 
1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] ret = obj(*args, **kwargs) [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] return self.do_request(method, action, body=body, [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] ret = obj(*args, **kwargs) [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] self._handle_fault_response(status_code, replybody, resp) [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] raise exception.Unauthorized() [ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] nova.exception.Unauthorized: Not authorized. 
[ 1068.342182] env[67144]: ERROR nova.compute.manager [instance: b1bba9da-84f7-4d67-8ad6-af7cb429dd9c] [ 1068.357812] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ac8e2cc2-704d-4397-aab1-5b1d6ce559d6 tempest-ServerAddressesTestJSON-52359226 tempest-ServerAddressesTestJSON-52359226-project-member] Lock "b1bba9da-84f7-4d67-8ad6-af7cb429dd9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 364.376s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1068.389876] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1068.390680] env[67144]: ERROR nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. 
[ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Traceback (most recent call last): [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] result = getattr(controller, method)(*args, **kwargs) [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self._get(image_id) [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] resp, body = self.http_client.get(url, headers=header) [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self.request(url, 'GET', **kwargs) [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self._handle_response(resp) [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise exc.from_response(resp, resp.content) [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] During handling of the above exception, another exception occurred: [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Traceback (most recent call last): [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] yield resources [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self.driver.spawn(context, instance, image_meta, [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self._fetch_image_if_missing(context, vi) [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] image_fetch(context, vi, tmp_image_ds_loc) [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] images.fetch_image( [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1068.390680] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] metadata = IMAGE_API.get(context, image_ref) [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return session.show(context, image_id, [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] _reraise_translated_image_exception(image_id) [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise new_exc.with_traceback(exc_trace) [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1068.391807] env[67144]: ERROR nova.compute.manager 
[instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] result = getattr(controller, method)(*args, **kwargs) [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self._get(image_id) [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] resp, body = self.http_client.get(url, headers=header) [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self.request(url, 'GET', **kwargs) [ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request 
[ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self._handle_response(resp)
[ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise exc.from_response(resp, resp.content)
[ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84.
[ 1068.391807] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1068.391807] env[67144]: INFO nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Terminating instance
[ 1068.392594] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1068.392806] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1068.393519] env[67144]: DEBUG nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Start destroying the instance on the hypervisor. {{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1068.393826] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1068.394089] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c13c3ade-71ab-40bc-ae1b-2c6bb1924668 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1068.398835] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3448fec6-7d07-4c10-b5c9-67efa088e90c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1068.405514] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1068.405775] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-94a81555-c773-4ac2-b90f-d30addeb0dda {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1068.408048] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1068.408210] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1068.409152] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aa897abd-c8ca-46c9-ab3c-a9897ff3d7c1 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1068.414071] env[67144]: DEBUG oslo_vmware.api [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Waiting for the task: (returnval){
[ 1068.414071] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5284ed06-1b57-75a1-a160-cd430836a5c0"
[ 1068.414071] env[67144]: _type = "Task"
[ 1068.414071] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1068.421195] env[67144]: DEBUG oslo_vmware.api [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5284ed06-1b57-75a1-a160-cd430836a5c0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1068.487799] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1068.488038] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1068.488227] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Deleting the datastore file [datastore1] 0811722e-2ae9-4018-a85d-ab4fe5f46370 {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1068.488481] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-96021311-97e8-4ad7-95ed-ddbee5bb6af7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1068.494751] env[67144]: DEBUG oslo_vmware.api [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Waiting for the task: (returnval){
[ 1068.494751] env[67144]: value = "task-2848110"
[ 1068.494751] env[67144]: _type = "Task"
[ 1068.494751] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1068.502462] env[67144]: DEBUG oslo_vmware.api [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Task: {'id': task-2848110, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1068.684531] env[67144]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848108, 'name': CreateVM_Task, 'duration_secs': 0.30782} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1068.684707] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Created VM on the ESX host {{(pid=67144) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1068.685432] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1068.685722] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1068.686068] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 1068.686309] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a8fa29ec-b8f0-4b75-9eb7-662cc4b2609b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1068.690540] env[67144]: DEBUG oslo_vmware.api [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Waiting for the task: (returnval){
[ 1068.690540] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]527b888f-2af7-e69a-0057-d3f2ca6c5f07"
[ 1068.690540] env[67144]: _type = "Task"
[ 1068.690540] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1068.697676] env[67144]: DEBUG oslo_vmware.api [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]527b888f-2af7-e69a-0057-d3f2ca6c5f07, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1068.924784] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1068.925156] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Creating directory with path [datastore1] vmware_temp/b714c9fb-ab78-43f8-9310-b9cc594d5b82/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1068.925263] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-df8a12ef-08cd-4f52-a126-fffb4e91fab0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1068.936760] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Created directory with path [datastore1] vmware_temp/b714c9fb-ab78-43f8-9310-b9cc594d5b82/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1068.936889] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Fetch image to [datastore1] vmware_temp/b714c9fb-ab78-43f8-9310-b9cc594d5b82/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1068.937082] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/b714c9fb-ab78-43f8-9310-b9cc594d5b82/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1068.937790] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc27fcd4-2d13-4349-bd95-0ff4219ef56f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1068.944537] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0199e8b4-e577-4836-b541-7ab12bca2a3c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1068.953584] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34f8a50d-ea69-49b5-8e10-254af38356a6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1068.985402] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e6d002b-d39a-42ac-bce5-c9d161a3cb9b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1068.991479] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3669cb5b-58db-4f05-8e81-177c195a0932 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1069.002888] env[67144]: DEBUG oslo_vmware.api [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Task: {'id': task-2848110, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072809} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1069.003180] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1069.003394] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1069.003583] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1069.003762] env[67144]: INFO nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Took 0.61 seconds to destroy the instance on the hypervisor.
[ 1069.005904] env[67144]: DEBUG nova.compute.claims [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1069.006089] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1069.006304] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1069.010325] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1069.034933] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1069.035717] env[67144]: DEBUG nova.compute.utils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Instance 0811722e-2ae9-4018-a85d-ab4fe5f46370 could not be found. {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1069.037185] env[67144]: DEBUG nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Instance disappeared during build. {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1069.038092] env[67144]: DEBUG nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1069.038092] env[67144]: DEBUG nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1069.038092] env[67144]: DEBUG nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1069.038092] env[67144]: DEBUG nova.network.neutron [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1069.106505] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1069.107297] env[67144]: ERROR nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84.
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Traceback (most recent call last):
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] result = getattr(controller, method)(*args, **kwargs)
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self._get(image_id)
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] resp, body = self.http_client.get(url, headers=header)
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self.request(url, 'GET', **kwargs)
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self._handle_response(resp)
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise exc.from_response(resp, resp.content)
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35]
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] During handling of the above exception, another exception occurred:
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35]
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Traceback (most recent call last):
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] yield resources
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self.driver.spawn(context, instance, image_meta,
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self._fetch_image_if_missing(context, vi)
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] image_fetch(context, vi, tmp_image_ds_loc)
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] images.fetch_image(
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1069.107297] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] metadata = IMAGE_API.get(context, image_ref)
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return session.show(context, image_id,
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] _reraise_translated_image_exception(image_id)
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise new_exc.with_traceback(exc_trace)
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] result = getattr(controller, method)(*args, **kwargs)
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self._get(image_id)
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] resp, body = self.http_client.get(url, headers=header)
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self.request(url, 'GET', **kwargs)
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self._handle_response(resp)
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise exc.from_response(resp, resp.content)
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84.
[ 1069.108391] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35]
[ 1069.108391] env[67144]: INFO nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Terminating instance
[ 1069.109258] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1069.109472] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1069.110118] env[67144]: DEBUG nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Start destroying the instance on the hypervisor. {{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1069.110739] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1069.110739] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b77206bf-8414-461d-9fa6-3b1c6b8b088a {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1069.113014] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c094e24-4c27-4e86-ab6f-f5d6041daff4 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1069.120814] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1069.121038] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8c446191-2a47-4877-96e4-fd0e4afcd9c3 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1069.123281] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1069.123506] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1069.124440] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7161226b-f6f7-4a7f-b0a5-262afdb19c77 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1069.129763] env[67144]: DEBUG oslo_vmware.api [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Waiting for the task: (returnval){
[ 1069.129763] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]524e6237-1615-e372-b602-9ca1f185e473"
[ 1069.129763] env[67144]: _type = "Task"
[ 1069.129763] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1069.132835] env[67144]: DEBUG neutronclient.v2_0.client [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67144) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1069.134404] env[67144]: ERROR nova.compute.manager [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Traceback (most recent call last):
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] result = getattr(controller, method)(*args, **kwargs)
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self._get(image_id)
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] resp, body = self.http_client.get(url, headers=header)
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self.request(url, 'GET', **kwargs)
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self._handle_response(resp)
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise exc.from_response(resp, resp.content)
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] During handling of the above exception, another exception occurred:
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Traceback (most recent call last):
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self.driver.spawn(context, instance, image_meta,
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self._fetch_image_if_missing(context, vi)
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] image_fetch(context, vi, tmp_image_ds_loc)
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] images.fetch_image(
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] metadata = IMAGE_API.get(context, image_ref)
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return session.show(context, image_id,
[ 1069.134404] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] _reraise_translated_image_exception(image_id)
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise new_exc.with_traceback(exc_trace)
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] result = getattr(controller, method)(*args, **kwargs)
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self._get(image_id)
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] resp, body = self.http_client.get(url, headers=header)
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self.request(url, 'GET', **kwargs)
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self._handle_response(resp)
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise exc.from_response(resp, resp.content)
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84.
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] During handling of the above exception, another exception occurred:
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Traceback (most recent call last):
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self._build_and_run_instance(context, instance, image,
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] with excutils.save_and_reraise_exception():
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self.force_reraise()
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise self.value
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] with self.rt.instance_claim(context, instance, node, allocs,
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self.abort()
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1069.135459] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return f(*args, **kwargs)
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self._unset_instance_host_and_node(instance)
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] instance.save()
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] updates, result = self.indirection_api.object_action(
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] result = self.transport._send(
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self._driver.send(target, ctxt, message,
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise result
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] nova.exception_Remote.InstanceNotFound_Remote: Instance 0811722e-2ae9-4018-a85d-ab4fe5f46370 could not be found.
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Traceback (most recent call last):
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return getattr(target, method)(*args, **kwargs)
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return fn(self, *args, **kwargs)
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] old_ref, inst_ref = db.instance_update_and_get_original(
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return f(*args, **kwargs)
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] with excutils.save_and_reraise_exception() as ectxt:
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self.force_reraise()
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise self.value
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return f(*args, **kwargs)
[ 1069.136571] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return f(context, *args, **kwargs)
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise exception.InstanceNotFound(instance_id=uuid)
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] nova.exception.InstanceNotFound: Instance 0811722e-2ae9-4018-a85d-ab4fe5f46370 could not be found.
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] During handling of the above exception, another exception occurred:
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Traceback (most recent call last):
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] ret = obj(*args, **kwargs)
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] exception_handler_v20(status_code, error_body)
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise client_exc(message=error_message,
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Neutron server returns request_ids: ['req-1bf9a78f-3dc5-48f8-9b7b-d704401bd7a5']
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] During handling of the above exception, another exception occurred:
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370]
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] Traceback (most recent call last):
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self._deallocate_network(context, instance, requested_networks)
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self.network_api.deallocate_for_instance(
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] data = neutron.list_ports(**search_opts)
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] ret = obj(*args, **kwargs)
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self.list('ports', self.ports_path, retrieve_all,
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] ret = obj(*args, **kwargs)
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1069.137804] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] for r in self._pagination(collection, path, **params):
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] res = self.get(path, params=params)
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] ret = obj(*args, **kwargs)
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self.retry_request("GET", action, body=body,
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] ret = obj(*args, **kwargs)
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] return self.do_request(method, action, body=body,
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] ret = obj(*args, **kwargs)
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] self._handle_fault_response(status_code, replybody, resp)
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] raise exception.Unauthorized()
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] nova.exception.Unauthorized: Not authorized.
[ 1069.138970] env[67144]: ERROR nova.compute.manager [instance: 0811722e-2ae9-4018-a85d-ab4fe5f46370] [ 1069.141910] env[67144]: DEBUG oslo_vmware.api [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]524e6237-1615-e372-b602-9ca1f185e473, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1069.156160] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9adcf617-aa50-4c78-980c-784f4f37d86f tempest-ServerMetadataNegativeTestJSON-1083055669 tempest-ServerMetadataNegativeTestJSON-1083055669-project-member] Lock "0811722e-2ae9-4018-a85d-ab4fe5f46370" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 361.177s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1069.202433] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1069.202704] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 tempest-ServerRescueNegativeTestJSON-105378752-project-member] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Processing image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1069.202956] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9cc2ca13-bd5c-4aa5-8aff-2c6a7a1e414e tempest-ServerRescueNegativeTestJSON-105378752 
tempest-ServerRescueNegativeTestJSON-105378752-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1069.203692] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1069.203879] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1069.204060] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Deleting the datastore file [datastore1] 42ce3afe-e725-4688-b048-bd6721c22c35 {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1069.204297] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8c24ce0e-83e8-48e1-8d3f-4eebf3cf5954 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.210581] env[67144]: DEBUG oslo_vmware.api [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Waiting for the task: (returnval){ [ 1069.210581] env[67144]: value = "task-2848112" [ 1069.210581] env[67144]: 
_type = "Task" [ 1069.210581] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1069.218609] env[67144]: DEBUG oslo_vmware.api [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Task: {'id': task-2848112, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1069.639858] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1069.640127] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Creating directory with path [datastore1] vmware_temp/0c9a4a39-f255-4f38-a5d1-18aad29358ba/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1069.640358] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fcc133ff-6358-4a41-a291-b59b84443b5d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.651305] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Created directory with path [datastore1] vmware_temp/0c9a4a39-f255-4f38-a5d1-18aad29358ba/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1069.651485] env[67144]: DEBUG 
nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Fetch image to [datastore1] vmware_temp/0c9a4a39-f255-4f38-a5d1-18aad29358ba/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1069.651653] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/0c9a4a39-f255-4f38-a5d1-18aad29358ba/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1069.652354] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a68ab2a9-3712-470c-a05d-0f5022c6f1d6 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.658887] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1889faa5-0396-41b2-9260-db4b1eb050d5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.668253] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2874c1c2-6bd3-4948-a002-2638be0fc548 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.699875] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3852ed8f-af80-4019-8ae4-8a7fbfd25d03 {{(pid=67144) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.705907] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2dbd8e97-9444-46dd-9ddc-58f30db246dd {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.719580] env[67144]: DEBUG oslo_vmware.api [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Task: {'id': task-2848112, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069245} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1069.719807] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1069.719997] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1069.720182] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1069.720383] env[67144]: INFO nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 
tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1069.722496] env[67144]: DEBUG nova.compute.claims [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1069.722680] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1069.722895] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1069.726820] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1069.749453] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1069.750120] env[67144]: DEBUG nova.compute.utils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Instance 42ce3afe-e725-4688-b048-bd6721c22c35 could not be found. {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1069.751106] env[67144]: DEBUG nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Instance disappeared during build. {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1069.751281] env[67144]: DEBUG nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1069.751439] env[67144]: DEBUG nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1069.751607] env[67144]: DEBUG nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1069.751768] env[67144]: DEBUG nova.network.neutron [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1069.781591] env[67144]: DEBUG nova.compute.manager [req-47d16795-cbd8-43c1-be8b-a2edac373956 req-2022bafc-68e4-41b1-a241-dbe259a937d6 service nova] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Received event network-changed-2e172af0-911b-4289-9b23-86e83386c66e {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1069.781795] env[67144]: DEBUG nova.compute.manager [req-47d16795-cbd8-43c1-be8b-a2edac373956 req-2022bafc-68e4-41b1-a241-dbe259a937d6 service nova] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Refreshing instance network info cache due to event network-changed-2e172af0-911b-4289-9b23-86e83386c66e. 
{{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1069.782029] env[67144]: DEBUG oslo_concurrency.lockutils [req-47d16795-cbd8-43c1-be8b-a2edac373956 req-2022bafc-68e4-41b1-a241-dbe259a937d6 service nova] Acquiring lock "refresh_cache-842426aa-72a3-4604-b50b-9705b55ea396" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1069.782179] env[67144]: DEBUG oslo_concurrency.lockutils [req-47d16795-cbd8-43c1-be8b-a2edac373956 req-2022bafc-68e4-41b1-a241-dbe259a937d6 service nova] Acquired lock "refresh_cache-842426aa-72a3-4604-b50b-9705b55ea396" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1069.782339] env[67144]: DEBUG nova.network.neutron [req-47d16795-cbd8-43c1-be8b-a2edac373956 req-2022bafc-68e4-41b1-a241-dbe259a937d6 service nova] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Refreshing network info cache for port 2e172af0-911b-4289-9b23-86e83386c66e {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1069.820718] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1069.821531] env[67144]: ERROR nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. 
[ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] Traceback (most recent call last): [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] result = getattr(controller, method)(*args, **kwargs) [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self._get(image_id) [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] resp, body = self.http_client.get(url, headers=header) [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self.request(url, 'GET', **kwargs) [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self._handle_response(resp) [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise exc.from_response(resp, resp.content) [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] During handling of the above exception, another exception occurred: [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] Traceback (most recent call last): [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] yield resources [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self.driver.spawn(context, instance, image_meta, [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self._fetch_image_if_missing(context, vi) [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] image_fetch(context, vi, tmp_image_ds_loc) [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] images.fetch_image( [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1069.821531] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] metadata = IMAGE_API.get(context, image_ref) [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return session.show(context, image_id, [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] _reraise_translated_image_exception(image_id) [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise new_exc.with_traceback(exc_trace) [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1069.822951] env[67144]: ERROR nova.compute.manager 
[instance: c3621484-8333-4375-9700-62b08d90887f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] result = getattr(controller, method)(*args, **kwargs) [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self._get(image_id) [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] resp, body = self.http_client.get(url, headers=header) [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self.request(url, 'GET', **kwargs) [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request 
[ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self._handle_response(resp) [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise exc.from_response(resp, resp.content) [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. [ 1069.822951] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] [ 1069.822951] env[67144]: INFO nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Terminating instance [ 1069.823872] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1069.823872] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1069.824075] env[67144]: DEBUG nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 
tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Start destroying the instance on the hypervisor. {{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1069.824278] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1069.824507] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bc9b65f4-182a-4a5b-90bb-4413e2cf5609 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.827491] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b0c5ba7-f96a-47ef-8252-7cefd5180e39 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.834536] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1069.834751] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-01297691-9213-4d39-90e7-8018d65844e9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.837140] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1069.837317] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1069.838248] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9f834f62-3f69-425b-a430-b2cf2b04927e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.843192] env[67144]: DEBUG oslo_vmware.api [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Waiting for the task: (returnval){ [ 1069.843192] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]526fc1ea-c351-431c-9875-77f0ceb130af" [ 1069.843192] env[67144]: _type = "Task" [ 1069.843192] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1069.850926] env[67144]: DEBUG oslo_vmware.api [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]526fc1ea-c351-431c-9875-77f0ceb130af, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1069.851575] env[67144]: DEBUG neutronclient.v2_0.client [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67144) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1069.853011] env[67144]: ERROR nova.compute.manager [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Traceback (most recent call last): [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] result = getattr(controller, method)(*args, **kwargs) [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self._get(image_id) [ 
1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] resp, body = self.http_client.get(url, headers=header) [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self.request(url, 'GET', **kwargs) [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self._handle_response(resp) [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise exc.from_response(resp, resp.content) [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. 
Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] During handling of the above exception, another exception occurred: [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Traceback (most recent call last): [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self.driver.spawn(context, instance, image_meta, [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self._fetch_image_if_missing(context, vi) [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] image_fetch(context, vi, tmp_image_ds_loc) 
[ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] images.fetch_image( [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] metadata = IMAGE_API.get(context, image_ref) [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return session.show(context, image_id, [ 1069.853011] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] _reraise_translated_image_exception(image_id) [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise new_exc.with_traceback(exc_trace) [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1069.854111] env[67144]: ERROR nova.compute.manager 
[instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] result = getattr(controller, method)(*args, **kwargs) [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self._get(image_id) [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] resp, body = self.http_client.get(url, headers=header) [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self.request(url, 'GET', **kwargs) [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self._handle_response(resp) [ 1069.854111] env[67144]: 
ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise exc.from_response(resp, resp.content) [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] During handling of the above exception, another exception occurred: [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Traceback (most recent call last): [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self._build_and_run_instance(context, instance, image, [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] with excutils.save_and_reraise_exception(): [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 
42ce3afe-e725-4688-b048-bd6721c22c35] self.force_reraise() [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise self.value [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] with self.rt.instance_claim(context, instance, node, allocs, [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self.abort() [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1069.854111] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return f(*args, **kwargs) [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] 
self._unset_instance_host_and_node(instance) [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] instance.save() [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] updates, result = self.indirection_api.object_action( [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return cctxt.call(context, 'object_action', objinst=objinst, [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] result = self.transport._send( [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self._driver.send(target, ctxt, message, [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1069.855275] env[67144]: ERROR 
nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise result [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] nova.exception_Remote.InstanceNotFound_Remote: Instance 42ce3afe-e725-4688-b048-bd6721c22c35 could not be found. [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Traceback (most recent call last): [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return getattr(target, method)(*args, **kwargs) [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return fn(self, *args, **kwargs) [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1069.855275] env[67144]: ERROR 
nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] old_ref, inst_ref = db.instance_update_and_get_original( [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return f(*args, **kwargs) [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] with excutils.save_and_reraise_exception() as ectxt: [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self.force_reraise() [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise self.value [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 
42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return f(*args, **kwargs) [ 1069.855275] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return f(context, *args, **kwargs) [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise exception.InstanceNotFound(instance_id=uuid) [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] nova.exception.InstanceNotFound: Instance 42ce3afe-e725-4688-b048-bd6721c22c35 could not be found. 
[ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] During handling of the above exception, another exception occurred: [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Traceback (most recent call last): [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] ret = obj(*args, **kwargs) [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] exception_handler_v20(status_code, error_body) [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise client_exc(message=error_message, [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 
42ce3afe-e725-4688-b048-bd6721c22c35] Neutron server returns request_ids: ['req-e97d38d0-08f7-4fb0-9d43-4e84c32ac6d7'] [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] During handling of the above exception, another exception occurred: [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] Traceback (most recent call last): [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self._deallocate_network(context, instance, requested_networks) [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self.network_api.deallocate_for_instance( [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] data = neutron.list_ports(**search_opts) [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] ret = obj(*args, **kwargs) [ 1069.856804] env[67144]: ERROR 
nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self.list('ports', self.ports_path, retrieve_all, [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] ret = obj(*args, **kwargs) [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1069.856804] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] for r in self._pagination(collection, path, **params): [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] res = self.get(path, params=params) [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] ret = obj(*args, **kwargs) [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self.retry_request("GET", action, body=body, [ 
1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] ret = obj(*args, **kwargs) [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] return self.do_request(method, action, body=body, [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] ret = obj(*args, **kwargs) [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] self._handle_fault_response(status_code, replybody, resp) [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] raise exception.Unauthorized() [ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] nova.exception.Unauthorized: Not authorized. 
[ 1069.858115] env[67144]: ERROR nova.compute.manager [instance: 42ce3afe-e725-4688-b048-bd6721c22c35] [ 1069.873589] env[67144]: DEBUG oslo_concurrency.lockutils [None req-da1c6b11-f559-4ea3-af5d-629e7ec57394 tempest-AttachVolumeNegativeTest-1526345894 tempest-AttachVolumeNegativeTest-1526345894-project-member] Lock "42ce3afe-e725-4688-b048-bd6721c22c35" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 358.714s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1069.894696] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1069.894859] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1069.895046] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Deleting the datastore file [datastore1] c3621484-8333-4375-9700-62b08d90887f {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1069.895304] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9eaf0543-8496-40c5-8f4c-bc223f62cb00 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.902632] env[67144]: DEBUG 
oslo_vmware.api [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Waiting for the task: (returnval){
[ 1069.902632] env[67144]: value = "task-2848114"
[ 1069.902632] env[67144]: _type = "Task"
[ 1069.902632] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1069.912303] env[67144]: DEBUG oslo_vmware.api [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Task: {'id': task-2848114, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1070.060782] env[67144]: DEBUG nova.network.neutron [req-47d16795-cbd8-43c1-be8b-a2edac373956 req-2022bafc-68e4-41b1-a241-dbe259a937d6 service nova] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Updated VIF entry in instance network info cache for port 2e172af0-911b-4289-9b23-86e83386c66e. {{(pid=67144) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1070.061103] env[67144]: DEBUG nova.network.neutron [req-47d16795-cbd8-43c1-be8b-a2edac373956 req-2022bafc-68e4-41b1-a241-dbe259a937d6 service nova] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Updating instance_info_cache with network_info: [{"id": "2e172af0-911b-4289-9b23-86e83386c66e", "address": "fa:16:3e:28:28:2c", "network": {"id": "0fa1e319-a38f-41b9-868f-3f758185cf0c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-525376206-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3902eab35278495e87c590a781241e16", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2c06e3c2-8edb-4cf0-be6b-45dfe059c00b", "external-id": "nsx-vlan-transportzone-264", "segmentation_id": 264, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e172af0-91", "ovs_interfaceid": "2e172af0-911b-4289-9b23-86e83386c66e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1070.070488] env[67144]: DEBUG oslo_concurrency.lockutils [req-47d16795-cbd8-43c1-be8b-a2edac373956 req-2022bafc-68e4-41b1-a241-dbe259a937d6 service nova] Releasing lock "refresh_cache-842426aa-72a3-4604-b50b-9705b55ea396" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1070.352896] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1070.353174] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Creating directory with path [datastore1] vmware_temp/b1db0eb3-f2d6-4de2-8fb6-cbe3d9df6ada/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1070.353414] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-52694a2e-a620-4e4d-a735-9f7dfd3563e7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1070.364547] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Created directory with path [datastore1] vmware_temp/b1db0eb3-f2d6-4de2-8fb6-cbe3d9df6ada/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1070.364732] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Fetch image to [datastore1] vmware_temp/b1db0eb3-f2d6-4de2-8fb6-cbe3d9df6ada/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1070.364906] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/b1db0eb3-f2d6-4de2-8fb6-cbe3d9df6ada/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1070.365640] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6dafff0-212a-4f9f-a563-54cd7e63144e {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1070.373147] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0569f4ed-c4f3-4ce0-8170-d13fd10db88d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1070.381671] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b4abf81-4d05-4785-8586-b3a76210da6b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1070.413775] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dceab7c5-ab3a-4da0-b182-008587bc4d91 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1070.421552] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-da97e4c7-3013-46ea-a044-516da9d04772 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1070.423162] env[67144]: DEBUG oslo_vmware.api [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Task: {'id': task-2848114, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074089} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1070.423386] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1070.423560] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1070.423728] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1070.423899] env[67144]: INFO nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 1070.425950] env[67144]: DEBUG nova.compute.claims [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1070.426203] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1070.426470] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1070.443264] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1070.450813] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1070.451483] env[67144]: DEBUG nova.compute.utils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Instance c3621484-8333-4375-9700-62b08d90887f could not be found. {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1070.452923] env[67144]: DEBUG nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Instance disappeared during build. {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1070.453112] env[67144]: DEBUG nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1070.453277] env[67144]: DEBUG nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1070.453442] env[67144]: DEBUG nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1070.453631] env[67144]: DEBUG nova.network.neutron [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1070.477908] env[67144]: DEBUG neutronclient.v2_0.client [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67144) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1070.479410] env[67144]: ERROR nova.compute.manager [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] [instance: c3621484-8333-4375-9700-62b08d90887f] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] Traceback (most recent call last):
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] result = getattr(controller, method)(*args, **kwargs)
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self._get(image_id)
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] resp, body = self.http_client.get(url, headers=header)
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self.request(url, 'GET', **kwargs)
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self._handle_response(resp)
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise exc.from_response(resp, resp.content)
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] During handling of the above exception, another exception occurred:
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] Traceback (most recent call last):
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self.driver.spawn(context, instance, image_meta,
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self._fetch_image_if_missing(context, vi)
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] image_fetch(context, vi, tmp_image_ds_loc)
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] images.fetch_image(
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] metadata = IMAGE_API.get(context, image_ref)
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return session.show(context, image_id,
[ 1070.479410] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] _reraise_translated_image_exception(image_id)
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise new_exc.with_traceback(exc_trace)
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] result = getattr(controller, method)(*args, **kwargs)
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self._get(image_id)
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] resp, body = self.http_client.get(url, headers=header)
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self.request(url, 'GET', **kwargs)
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self._handle_response(resp)
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise exc.from_response(resp, resp.content)
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84.
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] During handling of the above exception, another exception occurred:
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] Traceback (most recent call last):
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self._build_and_run_instance(context, instance, image,
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] with excutils.save_and_reraise_exception():
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self.force_reraise()
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise self.value
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] with self.rt.instance_claim(context, instance, node, allocs,
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self.abort()
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1070.480472] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return f(*args, **kwargs)
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self._unset_instance_host_and_node(instance)
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] instance.save()
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] updates, result = self.indirection_api.object_action(
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] result = self.transport._send(
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self._driver.send(target, ctxt, message,
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise result
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] nova.exception_Remote.InstanceNotFound_Remote: Instance c3621484-8333-4375-9700-62b08d90887f could not be found.
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] Traceback (most recent call last):
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return getattr(target, method)(*args, **kwargs)
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return fn(self, *args, **kwargs)
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] old_ref, inst_ref = db.instance_update_and_get_original(
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return f(*args, **kwargs)
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] with excutils.save_and_reraise_exception() as ectxt:
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self.force_reraise()
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise self.value
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return f(*args, **kwargs)
[ 1070.481664] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return f(context, *args, **kwargs)
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise exception.InstanceNotFound(instance_id=uuid)
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] nova.exception.InstanceNotFound: Instance c3621484-8333-4375-9700-62b08d90887f could not be found.
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] During handling of the above exception, another exception occurred:
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] Traceback (most recent call last):
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] ret = obj(*args, **kwargs)
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] exception_handler_v20(status_code, error_body)
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise client_exc(message=error_message,
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] Neutron server returns request_ids: ['req-5cacfeb5-505e-4d3d-9b76-954dfea1a253']
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] During handling of the above exception, another exception occurred:
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f]
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] Traceback (most recent call last):
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self._deallocate_network(context, instance, requested_networks)
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self.network_api.deallocate_for_instance(
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] data = neutron.list_ports(**search_opts)
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] ret = obj(*args, **kwargs)
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self.list('ports', self.ports_path, retrieve_all,
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] ret = obj(*args, **kwargs)
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1070.482734] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] for r in self._pagination(collection, path, **params):
[ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] res = self.get(path, params=params)
[ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] ret = obj(*args, **kwargs)
[ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self.retry_request("GET", action, body=body,
[
1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] ret = obj(*args, **kwargs) [ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] return self.do_request(method, action, body=body, [ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] ret = obj(*args, **kwargs) [ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] self._handle_fault_response(status_code, replybody, resp) [ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] raise exception.Unauthorized() [ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] nova.exception.Unauthorized: Not authorized. 
[ 1070.483865] env[67144]: ERROR nova.compute.manager [instance: c3621484-8333-4375-9700-62b08d90887f] [ 1070.499354] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a0c362ec-dcd3-43bf-ada2-65acee7b6f0c tempest-ServerActionsTestOtherA-1771457743 tempest-ServerActionsTestOtherA-1771457743-project-member] Lock "c3621484-8333-4375-9700-62b08d90887f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 355.022s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1070.537120] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1070.537968] env[67144]: ERROR nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. 
[ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Traceback (most recent call last): [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] result = getattr(controller, method)(*args, **kwargs) [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] return self._get(image_id) [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] resp, body = self.http_client.get(url, headers=header) [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] return self.request(url, 'GET', **kwargs) [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] return self._handle_response(resp) [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] raise exc.from_response(resp, resp.content) [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] During handling of the above exception, another exception occurred: [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Traceback (most recent call last): [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] yield resources [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] self.driver.spawn(context, instance, image_meta, [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] self._fetch_image_if_missing(context, vi) [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] image_fetch(context, vi, tmp_image_ds_loc) [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] images.fetch_image( [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1070.537968] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] metadata = IMAGE_API.get(context, image_ref) [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] return session.show(context, image_id, [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] _reraise_translated_image_exception(image_id) [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] raise new_exc.with_traceback(exc_trace) [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1070.538928] env[67144]: ERROR nova.compute.manager 
[instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] result = getattr(controller, method)(*args, **kwargs) [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] return self._get(image_id) [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] resp, body = self.http_client.get(url, headers=header) [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] return self.request(url, 'GET', **kwargs) [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request 
[ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] return self._handle_response(resp) [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] raise exc.from_response(resp, resp.content) [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] nova.exception.ImageNotAuthorized: Not authorized for image 0a8f8f2e-82dd-4c4f-80fe-9515de315a84. [ 1070.538928] env[67144]: ERROR nova.compute.manager [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] [ 1070.538928] env[67144]: INFO nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Terminating instance [ 1070.539859] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1070.540085] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1070.540579] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 
tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquiring lock "refresh_cache-b932a680-76a5-4f08-ac38-2fc1578b4a86" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1070.540758] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquired lock "refresh_cache-b932a680-76a5-4f08-ac38-2fc1578b4a86" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1070.540928] env[67144]: DEBUG nova.network.neutron [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1070.541852] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5a981b9a-d137-4263-aa8d-fe32e283bdd7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.547819] env[67144]: DEBUG nova.compute.utils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Can not refresh info_cache because instance was not found {{(pid=67144) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1070.550992] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1070.551183] env[67144]: 
DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1070.552576] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-426a597d-3add-402e-b38f-734d8a61b83d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.561504] env[67144]: DEBUG oslo_vmware.api [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Waiting for the task: (returnval){ [ 1070.561504] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]5208751b-99a8-7597-c097-503cd946c8a3" [ 1070.561504] env[67144]: _type = "Task" [ 1070.561504] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1070.566332] env[67144]: DEBUG nova.network.neutron [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Instance cache missing network info. 
{{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1070.575217] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1070.575452] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Creating directory with path [datastore1] vmware_temp/1c835e66-9584-4272-9629-035c746be721/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1070.575690] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f0e847bd-701e-4bc0-be1f-20e338eaf0c0 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.594591] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Created directory with path [datastore1] vmware_temp/1c835e66-9584-4272-9629-035c746be721/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1070.594820] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Fetch image to [datastore1] vmware_temp/1c835e66-9584-4272-9629-035c746be721/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1070.594998] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/1c835e66-9584-4272-9629-035c746be721/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1070.595812] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deb67cf1-7840-4555-820e-f1426f25c619 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.604828] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eb2aa27-550b-4032-9b68-b8f0fc4292a5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.614825] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fe493f6-7b1a-4131-8ce3-37f72b07c34b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.645256] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80d0c985-6406-4e5a-a3a8-fbac00bfaff7 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.648826] env[67144]: DEBUG nova.network.neutron [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Updating instance_info_cache with network_info: [] {{(pid=67144) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1070.651581] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-67d73473-69d8-4c18-9dcd-fd884948e0a1 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.658444] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Releasing lock "refresh_cache-b932a680-76a5-4f08-ac38-2fc1578b4a86" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1070.658825] env[67144]: DEBUG nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Start destroying the instance on the hypervisor. {{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1070.659023] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1070.660042] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3eeba6ff-79ec-4216-8c72-c202bbfe0507 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.666998] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Unregistering the VM {{(pid=67144) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1070.667427] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8974eb76-c1ae-4257-a7d0-d2d161a631ca {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.671852] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1070.689293] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1070.689564] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1070.689742] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Deleting the datastore file [datastore1] b932a680-76a5-4f08-ac38-2fc1578b4a86 {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1070.690016] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e6a4d7a3-849c-440c-901c-cc9408637480 
{{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.697213] env[67144]: DEBUG oslo_vmware.api [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Waiting for the task: (returnval){ [ 1070.697213] env[67144]: value = "task-2848116" [ 1070.697213] env[67144]: _type = "Task" [ 1070.697213] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1070.704788] env[67144]: DEBUG oslo_vmware.api [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Task: {'id': task-2848116, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1070.720527] env[67144]: DEBUG oslo_vmware.rw_handles [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1c835e66-9584-4272-9629-035c746be721/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1070.779213] env[67144]: DEBUG oslo_vmware.rw_handles [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Completed reading data from the image iterator. 
{{(pid=67144) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1070.779432] env[67144]: DEBUG oslo_vmware.rw_handles [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1c835e66-9584-4272-9629-035c746be721/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1071.207741] env[67144]: DEBUG oslo_vmware.api [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Task: {'id': task-2848116, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.035813} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1071.208090] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1071.208239] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1071.208451] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Instance destroyed {{(pid=67144) destroy 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1071.208658] env[67144]: INFO nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Took 0.55 seconds to destroy the instance on the hypervisor. [ 1071.208920] env[67144]: DEBUG oslo.service.loopingcall [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67144) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1071.209178] env[67144]: DEBUG nova.compute.manager [-] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Skipping network deallocation for instance since networking was not requested. {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1071.211351] env[67144]: DEBUG nova.compute.claims [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1071.211539] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1071.211787] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1071.236416] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1071.237188] env[67144]: DEBUG nova.compute.utils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Instance b932a680-76a5-4f08-ac38-2fc1578b4a86 could not be found. {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1071.238619] env[67144]: DEBUG nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Instance disappeared during build. 
{{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1071.238820] env[67144]: DEBUG nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1071.239090] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquiring lock "refresh_cache-b932a680-76a5-4f08-ac38-2fc1578b4a86" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1071.239270] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Acquired lock "refresh_cache-b932a680-76a5-4f08-ac38-2fc1578b4a86" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1071.239447] env[67144]: DEBUG nova.network.neutron [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Building network info cache for instance {{(pid=67144) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1071.246311] env[67144]: DEBUG nova.compute.utils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Can not refresh info_cache because instance was not found {{(pid=67144) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1071.262823] env[67144]: DEBUG nova.network.neutron [None 
req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Instance cache missing network info. {{(pid=67144) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1071.319730] env[67144]: DEBUG nova.network.neutron [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1071.329084] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Releasing lock "refresh_cache-b932a680-76a5-4f08-ac38-2fc1578b4a86" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1071.329331] env[67144]: DEBUG nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1071.329517] env[67144]: DEBUG nova.compute.manager [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] [instance: b932a680-76a5-4f08-ac38-2fc1578b4a86] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1071.371601] env[67144]: DEBUG oslo_concurrency.lockutils [None req-05968818-acdb-4ff2-a0d3-b0768075b102 tempest-ServerShowV247Test-925015620 tempest-ServerShowV247Test-925015620-project-member] Lock "b932a680-76a5-4f08-ac38-2fc1578b4a86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 349.562s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1115.417064] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1115.427268] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1115.427628] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1115.427905] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1115.428265] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Auditing locally available compute 
resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67144) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1115.429966] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a98e6e2-823d-4275-b89e-59c8dde751cc {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1115.442178] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58b6169f-4f8e-4bf3-8542-61e687aa125d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1115.458717] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e673410e-364b-424e-be77-4d6372590c7d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1115.464934] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9caaaeb9-0679-4bc4-905d-295cfee9a5c5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1115.493604] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180915MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=67144) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1115.493828] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1115.494127] env[67144]: DEBUG oslo_concurrency.lockutils [None 
req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1115.531554] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Instance 842426aa-72a3-4604-b50b-9705b55ea396 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67144) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1115.531770] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1115.531917] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=67144) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1115.559066] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3672d51f-4883-4b18-b311-b70891d531af {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1115.565727] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c1b31de-bdd1-4886-b0ba-d9cd2c79147c {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1115.595926] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b3e1cc0f-f725-4153-afe7-1af7f123a6b5 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1115.602403] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38f84df2-bda2-4237-9ab3-41e73e82c1e8 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1115.614714] env[67144]: DEBUG nova.compute.provider_tree [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed in ProviderTree for provider: 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 {{(pid=67144) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1115.622535] env[67144]: DEBUG nova.scheduler.client.report [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Inventory has not changed for provider 0828d7f8-c9f7-4ab8-bc5e-c8712ec6bae8 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67144) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1115.634796] env[67144]: DEBUG nova.compute.resource_tracker [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67144) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1115.634966] env[67144]: DEBUG oslo_concurrency.lockutils [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s {{(pid=67144) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1116.634582] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1117.417456] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1117.417731] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Starting heal instance info cache {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1117.417802] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Rebuilding the list of instances to heal {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1117.428075] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Skipping network cache update for instance because it is Building. {{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1117.428252] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Didn't find any instances for network info cache update. 
{{(pid=67144) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1117.428417] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1117.428793] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1118.417166] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1118.477025] env[67144]: WARNING oslo_vmware.rw_handles [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1118.477025] env[67144]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1118.477025] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1118.477025] env[67144]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1118.477025] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1118.477025] env[67144]: ERROR oslo_vmware.rw_handles response.begin() [ 1118.477025] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1118.477025] env[67144]: ERROR oslo_vmware.rw_handles version, 
status, reason = self._read_status() [ 1118.477025] env[67144]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1118.477025] env[67144]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1118.477025] env[67144]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1118.477025] env[67144]: ERROR oslo_vmware.rw_handles [ 1118.477451] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Downloaded image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to vmware_temp/1c835e66-9584-4272-9629-035c746be721/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1118.479276] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Caching image {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1118.479539] env[67144]: DEBUG nova.virt.vmwareapi.vm_util [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Copying Virtual Disk [datastore1] vmware_temp/1c835e66-9584-4272-9629-035c746be721/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk to [datastore1] vmware_temp/1c835e66-9584-4272-9629-035c746be721/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk {{(pid=67144) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1118.479821] env[67144]: DEBUG 
oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-92ab9d09-6cdf-408b-be18-d362fea2a7f9 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1118.487278] env[67144]: DEBUG oslo_vmware.api [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Waiting for the task: (returnval){ [ 1118.487278] env[67144]: value = "task-2848117" [ 1118.487278] env[67144]: _type = "Task" [ 1118.487278] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1118.494995] env[67144]: DEBUG oslo_vmware.api [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Task: {'id': task-2848117, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1118.998127] env[67144]: DEBUG oslo_vmware.exceptions [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Fault InvalidArgument not matched. 
{{(pid=67144) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1118.998348] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1118.998881] env[67144]: ERROR nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1118.998881] env[67144]: Faults: ['InvalidArgument'] [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Traceback (most recent call last): [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] yield resources [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] self.driver.spawn(context, instance, image_meta, [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 
670f3974-b332-48c2-9aab-6a9ed01731b7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] self._fetch_image_if_missing(context, vi) [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] image_cache(vi, tmp_image_ds_loc) [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] vm_util.copy_virtual_disk( [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] session._wait_for_task(vmdk_copy_task) [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] return self.wait_for_task(task_ref) [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 
670f3974-b332-48c2-9aab-6a9ed01731b7] return evt.wait() [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] result = hub.switch() [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] return self.greenlet.switch() [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] self.f(*self.args, **self.kw) [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] raise exceptions.translate_fault(task_info.error) [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Faults: ['InvalidArgument'] [ 1118.998881] env[67144]: ERROR nova.compute.manager [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] [ 1118.999900] env[67144]: INFO nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 
tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Terminating instance [ 1119.000782] env[67144]: DEBUG oslo_concurrency.lockutils [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/0a8f8f2e-82dd-4c4f-80fe-9515de315a84.vmdk" {{(pid=67144) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1119.001008] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1119.001243] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-27af89f7-274d-4a66-816b-1176c0aedadf {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1119.003351] env[67144]: DEBUG nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Start destroying the instance on the hypervisor. 
{{(pid=67144) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1119.003546] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Destroying instance {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1119.004249] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82c33ede-f395-4fe4-bf1e-14e29dac180b {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.010705] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Unregistering the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1119.010907] env[67144]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-71184539-7b89-43b5-a7ad-a79e955d683d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.012960] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1119.013146] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67144) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1119.014037] env[67144]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4aa5aa77-20af-4c5c-862e-79fa55500311 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.018460] env[67144]: DEBUG oslo_vmware.api [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Waiting for the task: (returnval){
[ 1119.018460] env[67144]: value = "session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]529116f5-a678-e43c-96d3-cd936259703d"
[ 1119.018460] env[67144]: _type = "Task"
[ 1119.018460] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1119.026551] env[67144]: DEBUG oslo_vmware.api [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Task: {'id': session[52dd31bf-746b-ea79-40ca-18c1a0ecd870]529116f5-a678-e43c-96d3-cd936259703d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1119.086886] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Unregistered the VM {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1119.087220] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Deleting contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1119.087466] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Deleting the datastore file [datastore1] 670f3974-b332-48c2-9aab-6a9ed01731b7 {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1119.087793] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6e79371c-f0e1-437a-a550-82cf1dfd2853 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.094688] env[67144]: DEBUG oslo_vmware.api [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Waiting for the task: (returnval){
[ 1119.094688] env[67144]: value = "task-2848119"
[ 1119.094688] env[67144]: _type = "Task"
[ 1119.094688] env[67144]: } to complete. {{(pid=67144) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1119.104590] env[67144]: DEBUG oslo_vmware.api [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Task: {'id': task-2848119, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1119.411570] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1119.416195] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1119.416359] env[67144]: DEBUG nova.compute.manager [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67144) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1119.528131] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Preparing fetch location {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1119.528472] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Creating directory with path [datastore1] vmware_temp/7cdb8b04-e9eb-4699-a37f-7a82205b99c6/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1119.528592] env[67144]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-298c79db-35c0-456d-867d-4d95d36b7fdb {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.539365] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Created directory with path [datastore1] vmware_temp/7cdb8b04-e9eb-4699-a37f-7a82205b99c6/0a8f8f2e-82dd-4c4f-80fe-9515de315a84 {{(pid=67144) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1119.539557] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Fetch image to [datastore1] vmware_temp/7cdb8b04-e9eb-4699-a37f-7a82205b99c6/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk {{(pid=67144) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1119.539726] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to [datastore1] vmware_temp/7cdb8b04-e9eb-4699-a37f-7a82205b99c6/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk on the data store datastore1 {{(pid=67144) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1119.540401] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95e39426-d077-4075-9aec-e64c61baf150 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.546623] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d25517d-ca4c-4072-9746-f6d1074f7f3d {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.555152] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c791471-bfbb-4078-bd97-8083d9f8b37f {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.584648] env[67144]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de092246-b11c-4295-b1d1-cf74a0d06ce2 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.589882] env[67144]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f126c3be-5882-4557-8ae0-336228c5a592 {{(pid=67144) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1119.602561] env[67144]: DEBUG oslo_vmware.api [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Task: {'id': task-2848119, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062901} completed successfully. {{(pid=67144) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1119.602775] env[67144]: DEBUG nova.virt.vmwareapi.ds_util [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Deleted the datastore file {{(pid=67144) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1119.602951] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Deleted contents of the VM from datastore datastore1 {{(pid=67144) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1119.603144] env[67144]: DEBUG nova.virt.vmwareapi.vmops [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Instance destroyed {{(pid=67144) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1119.603319] env[67144]: INFO nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 1119.605346] env[67144]: DEBUG nova.compute.claims [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Aborting claim: {{(pid=67144) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1119.605527] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1119.605737] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1119.613109] env[67144]: DEBUG nova.virt.vmwareapi.images [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] [instance: 3ce17a5c-b299-4c3e-8ccd-5587da8a4b2f] Downloading image file data 0a8f8f2e-82dd-4c4f-80fe-9515de315a84 to the data store datastore1 {{(pid=67144) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1119.629792] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1119.630452] env[67144]: DEBUG nova.compute.utils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Instance 670f3974-b332-48c2-9aab-6a9ed01731b7 could not be found. {{(pid=67144) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1119.631850] env[67144]: DEBUG nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Instance disappeared during build. {{(pid=67144) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1119.632033] env[67144]: DEBUG nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Unplugging VIFs for instance {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1119.632201] env[67144]: DEBUG nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67144) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1119.632351] env[67144]: DEBUG nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Deallocating network for instance {{(pid=67144) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1119.632516] env[67144]: DEBUG nova.network.neutron [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] deallocate_for_instance() {{(pid=67144) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1119.659162] env[67144]: DEBUG nova.network.neutron [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Updating instance_info_cache with network_info: [] {{(pid=67144) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1119.660680] env[67144]: DEBUG oslo_vmware.rw_handles [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7cdb8b04-e9eb-4699-a37f-7a82205b99c6/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1119.714267] env[67144]: INFO nova.compute.manager [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] [instance: 670f3974-b332-48c2-9aab-6a9ed01731b7] Took 0.08 seconds to deallocate network for instance.
[ 1119.718315] env[67144]: DEBUG oslo_vmware.rw_handles [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Completed reading data from the image iterator. {{(pid=67144) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1119.718476] env[67144]: DEBUG oslo_vmware.rw_handles [None req-9b567b92-4a19-4635-aa0a-2ca9275841cf tempest-InstanceActionsNegativeTestJSON-2120890364 tempest-InstanceActionsNegativeTestJSON-2120890364-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7cdb8b04-e9eb-4699-a37f-7a82205b99c6/0a8f8f2e-82dd-4c4f-80fe-9515de315a84/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67144) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1119.754501] env[67144]: DEBUG oslo_concurrency.lockutils [None req-a3d92d93-5c02-431b-a98e-b2c8abcb9eca tempest-DeleteServersAdminTestJSON-1141988955 tempest-DeleteServersAdminTestJSON-1141988955-project-member] Lock "670f3974-b332-48c2-9aab-6a9ed01731b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 299.255s {{(pid=67144) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1120.417213] env[67144]: DEBUG oslo_service.periodic_task [None req-ef4a12b5-fc68-4ebb-b0a7-847046ad91a6 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67144) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1135.568767] env[67144]: DEBUG nova.compute.manager [req-2faada8f-967e-497f-98d1-f18f10662028 req-efa166f2-2d57-4650-8cd9-a7bcc8fa1ee7 service nova] [instance: 842426aa-72a3-4604-b50b-9705b55ea396] Received event network-vif-deleted-2e172af0-911b-4289-9b23-86e83386c66e {{(pid=67144) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}